Feb 17 14:05:18 crc systemd[1]: Starting Kubernetes Kubelet... Feb 17 14:05:18 crc restorecon[4666]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 17 14:05:18 
crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 
14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc 
restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 
crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 
14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:05:18 crc 
restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 
14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:18 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:05:19 crc restorecon[4666]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
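[Editorial note] The long run of "not reset as customized by admin" lines above is restorecon declining to touch files whose current SELinux type appears in the policy's customizable_types set; on RHEL-family systems that set includes container_file_t, so pod volumes labeled by the container runtime are deliberately left alone unless a forced relabel (restorecon -F) is requested. A minimal sketch of that per-file decision, in Go — an illustration of the documented behavior, not the real SELinux userspace code; the path and default context below are hypothetical:

package main

import (
	"fmt"
	"strings"
)

// customizableTypes stands in for the loaded policy's customizable_types set;
// container_file_t is on that list on RHEL-family systems (assumption for this sketch).
var customizableTypes = map[string]bool{"container_file_t": true}

// typeOf extracts the type field from a user:role:type:level context string.
func typeOf(ctx string) string {
	parts := strings.SplitN(ctx, ":", 4)
	if len(parts) < 3 {
		return ""
	}
	return parts[2]
}

// relabel mimics restorecon's per-file choice: leave customizable types alone
// unless forced, otherwise relabel to the policy default.
func relabel(path, current, def string, force bool) {
	switch {
	case current == def:
		// already labeled correctly: nothing to report
	case !force && customizableTypes[typeOf(current)]:
		fmt.Printf("%s not reset as customized by admin to %s\n", path, current)
	default:
		fmt.Printf("Relabeled %s from %s to %s\n", path, current, def)
	}
}

func main() {
	// Hypothetical file and default context, shaped like the entries above.
	relabel("/var/lib/kubelet/pods/.../etc-hosts",
		"system_u:object_r:container_file_t:s0:c7,c13",
		"system_u:object_r:var_lib_t:s0",
		false)
}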
Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 14:05:19 crc kubenswrapper[4762]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.848624 4762 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852706 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852766 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852772 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852776 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852781 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852786 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852791 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852803 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852807 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852811 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852815 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852819 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852823 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852827 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852835 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852840 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852843 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852847 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852851 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852865 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
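[Editorial note] The "Flag --... has been deprecated" warnings above all carry the same remedy: move each setting into the file passed via the kubelet's --config flag, per the linked kubernetes.io page. A hedged sketch of that mapping, assuming the public KubeletConfiguration (kubelet.config.k8s.io/v1beta1) field names; the endpoint, paths, and reservation values are illustrative placeholders, not values read from this node. Since JSON is a subset of YAML, the emitted document is a valid --config file body:

package main

import (
	"encoding/json"
	"fmt"
)

// kubeletConfig models only the fields that replace the deprecated flags seen
// in this log; the real type is KubeletConfiguration in k8s.io/kubelet/config/v1beta1.
type kubeletConfig struct {
	Kind                     string            `json:"kind"`
	APIVersion               string            `json:"apiVersion"`
	ContainerRuntimeEndpoint string            `json:"containerRuntimeEndpoint,omitempty"` // was --container-runtime-endpoint
	VolumePluginDir          string            `json:"volumePluginDir,omitempty"`          // was --volume-plugin-dir
	SystemReserved           map[string]string `json:"systemReserved,omitempty"`           // was --system-reserved
	RegisterWithTaints       []taint           `json:"registerWithTaints,omitempty"`       // was --register-with-taints
}

type taint struct {
	Key    string `json:"key"`
	Value  string `json:"value,omitempty"`
	Effect string `json:"effect"`
}

func main() {
	cfg := kubeletConfig{
		Kind:                     "KubeletConfiguration",
		APIVersion:               "kubelet.config.k8s.io/v1beta1",
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock", // illustrative CRI-O socket
		VolumePluginDir:          "/etc/kubernetes/kubelet-plugins/volume/exec",
		SystemReserved:           map[string]string{"cpu": "500m", "memory": "1Gi"},
		RegisterWithTaints:       []taint{{Key: "node-role.kubernetes.io/master", Effect: "NoSchedule"}},
	}
	out, _ := json.MarshalIndent(cfg, "", "  ")
	fmt.Println(string(out))
}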
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852877 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852881 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852885 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852891 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852895 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852900 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852904 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852908 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852912 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852916 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852920 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852929 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852933 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852937 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852943 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852951 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852958 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852963 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852968 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852972 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852976 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852980 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852985 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852991 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852995 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.852999 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853003 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853006 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853010 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853014 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853018 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853021 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853026 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853030 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853035 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853039 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853046 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853052 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853056 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853059 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853063 4762 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853071 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853075 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853080 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853083 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853087 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853094 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853098 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853102 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853106 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.853109 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854216 4762 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854239 4762 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854253 4762 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854261 4762 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854271 4762 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854284 4762 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854295 4762 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854304 4762 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854316 4762 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854322 4762 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854329 4762 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854336 4762 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854342 4762 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854348 4762 flags.go:64] FLAG: --cgroup-root="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854364 4762 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854369 4762 flags.go:64] FLAG: --client-ca-file="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854374 4762 flags.go:64] FLAG: --cloud-config="" Feb 17 14:05:19 crc 
kubenswrapper[4762]: I0217 14:05:19.854599 4762 flags.go:64] FLAG: --cloud-provider="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854609 4762 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854617 4762 flags.go:64] FLAG: --cluster-domain="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854622 4762 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854627 4762 flags.go:64] FLAG: --config-dir="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854631 4762 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854659 4762 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854668 4762 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854673 4762 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854679 4762 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854685 4762 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854690 4762 flags.go:64] FLAG: --contention-profiling="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854694 4762 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854699 4762 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854704 4762 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854708 4762 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854715 4762 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854721 4762 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854726 4762 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854730 4762 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854735 4762 flags.go:64] FLAG: --enable-server="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854739 4762 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854747 4762 flags.go:64] FLAG: --event-burst="100" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854752 4762 flags.go:64] FLAG: --event-qps="50" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854757 4762 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854762 4762 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854766 4762 flags.go:64] FLAG: --eviction-hard="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854771 4762 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854776 4762 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854781 4762 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 
14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854786 4762 flags.go:64] FLAG: --eviction-soft="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854791 4762 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854795 4762 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854800 4762 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854804 4762 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854808 4762 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854812 4762 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854816 4762 flags.go:64] FLAG: --feature-gates="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854823 4762 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854827 4762 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854832 4762 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854839 4762 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854844 4762 flags.go:64] FLAG: --healthz-port="10248" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854848 4762 flags.go:64] FLAG: --help="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854853 4762 flags.go:64] FLAG: --hostname-override="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854857 4762 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854862 4762 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854866 4762 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854871 4762 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854875 4762 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854880 4762 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854884 4762 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854888 4762 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854892 4762 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854897 4762 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854902 4762 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854906 4762 flags.go:64] FLAG: --kube-reserved="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854910 4762 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854915 4762 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854919 4762 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 14:05:19 crc kubenswrapper[4762]: 
I0217 14:05:19.854924 4762 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854928 4762 flags.go:64] FLAG: --lock-file="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854932 4762 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854936 4762 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854941 4762 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854949 4762 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854954 4762 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854959 4762 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854963 4762 flags.go:64] FLAG: --logging-format="text" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854967 4762 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854972 4762 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854976 4762 flags.go:64] FLAG: --manifest-url="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854980 4762 flags.go:64] FLAG: --manifest-url-header="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854989 4762 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854993 4762 flags.go:64] FLAG: --max-open-files="1000000" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.854999 4762 flags.go:64] FLAG: --max-pods="110" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855004 4762 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855008 4762 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855013 4762 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855017 4762 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855022 4762 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855027 4762 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855031 4762 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855047 4762 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855052 4762 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855057 4762 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855062 4762 flags.go:64] FLAG: --pod-cidr="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855066 4762 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855074 4762 flags.go:64] FLAG: 
--pod-manifest-path="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855079 4762 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855084 4762 flags.go:64] FLAG: --pods-per-core="0" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855088 4762 flags.go:64] FLAG: --port="10250" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855092 4762 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855096 4762 flags.go:64] FLAG: --provider-id="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855100 4762 flags.go:64] FLAG: --qos-reserved="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855105 4762 flags.go:64] FLAG: --read-only-port="10255" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855109 4762 flags.go:64] FLAG: --register-node="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855113 4762 flags.go:64] FLAG: --register-schedulable="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855118 4762 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855127 4762 flags.go:64] FLAG: --registry-burst="10" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855132 4762 flags.go:64] FLAG: --registry-qps="5" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855136 4762 flags.go:64] FLAG: --reserved-cpus="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855141 4762 flags.go:64] FLAG: --reserved-memory="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855148 4762 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855152 4762 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855158 4762 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855162 4762 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855167 4762 flags.go:64] FLAG: --runonce="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855171 4762 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855175 4762 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855179 4762 flags.go:64] FLAG: --seccomp-default="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855184 4762 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855188 4762 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855192 4762 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855197 4762 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855202 4762 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855206 4762 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855210 4762 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855214 4762 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 
14:05:19.855218 4762 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855223 4762 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855227 4762 flags.go:64] FLAG: --system-cgroups="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855231 4762 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855241 4762 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855245 4762 flags.go:64] FLAG: --tls-cert-file="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855249 4762 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855255 4762 flags.go:64] FLAG: --tls-min-version="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855259 4762 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855263 4762 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855267 4762 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855271 4762 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855275 4762 flags.go:64] FLAG: --v="2" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855290 4762 flags.go:64] FLAG: --version="false" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855297 4762 flags.go:64] FLAG: --vmodule="" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855302 4762 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.855307 4762 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855447 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855454 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855459 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855463 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855467 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855472 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855477 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855482 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855486 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855491 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855495 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855500 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855504 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855510 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855516 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855521 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855526 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855530 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855535 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855541 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855546 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855549 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855553 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855557 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855560 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855564 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855568 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855571 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855575 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855578 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855582 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855585 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855590 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855594 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855598 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855602 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855606 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855609 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855614 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855617 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855621 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855624 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855627 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855631 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855635 4762 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855661 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855665 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855670 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855675 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855679 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855683 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855686 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855690 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855694 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855697 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855701 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855704 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855708 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855712 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855715 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855719 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855722 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855725 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855729 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855736 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855740 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855744 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855748 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855751 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.855755 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:05:19 crc 
kubenswrapper[4762]: W0217 14:05:19.855758 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.856547 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.865757 4762 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.865809 4762 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865892 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865902 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865910 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865915 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865919 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865922 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865926 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865930 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865934 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865937 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865941 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865944 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865948 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865953 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
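[editor's note] The long runs of "unrecognized feature gate" warnings are OpenShift-level gate names that this kubelet build does not know; the kubelet simply ignores them, and only Kubernetes-level gates survive into the resolved map printed at feature_gate.go:386 above. A sketch of how the four gates resolved to true would be expressed in the same config file, assuming kubelet.config.k8s.io/v1beta1; the unrecognized OpenShift names are not valid keys here:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      CloudDualStackNodeIPs: true                    # GA; setting it explicitly draws the warning above
      DisableKubeletCloudCredentialProviders: true   # GA; same warning
      KMSv1: true                                    # deprecated gate, per feature_gate.go:351
      ValidatingAdmissionPolicy: true                # GA, per feature_gate.go:353
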
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865958 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865962 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865966 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865970 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865973 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865977 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865981 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865985 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865988 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865992 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865996 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.865999 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866003 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866008 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866012 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866017 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866021 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866026 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866030 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866035 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866040 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866046 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866050 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866055 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866061 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866067 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 
17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866072 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866079 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866086 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866091 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866096 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866101 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866105 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866110 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866114 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866119 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866123 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866127 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866132 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866138 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866143 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866149 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866153 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866158 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866162 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866166 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866171 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866175 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866179 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866182 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866187 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866190 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866194 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866197 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866201 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866204 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866209 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.866216 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866329 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866334 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866339 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866342 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866346 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866350 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866354 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866357 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866361 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866364 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866368 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866371 4762 feature_gate.go:330] unrecognized feature 
gate: BootcNodeManagement Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866375 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866378 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866382 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866385 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866390 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866395 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866399 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866403 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866407 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866411 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866415 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866419 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866422 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866426 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866430 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866433 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866437 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866440 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866443 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866447 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866450 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866454 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866458 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866462 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866465 4762 feature_gate.go:330] unrecognized feature gate: 
InsightsRuntimeExtractor Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866468 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866472 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866475 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866479 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866482 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866486 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866491 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866495 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866499 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866503 4762 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866507 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866512 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866516 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866520 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866525 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866529 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866533 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866538 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866542 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866547 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866552 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866556 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866560 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866565 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866569 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866575 4762 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866581 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866586 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866591 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866596 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866600 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866604 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866609 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.866616 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.866624 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.867879 4762 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.873113 4762 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.874110 4762 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
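[editor's note] "Client rotation is on" here, even though the FLAG dump above shows --rotate-certificates="false"; the flag dump reflects defaults unless a flag was actually passed, so rotation is evidently enabled through the config file rather than the command line. A sketch of the presumed stanza, assuming kubelet.config.k8s.io/v1beta1; serverTLSBootstrap is shown only for contrast, at its default, since nothing in this log indicates server certificate rotation:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # enables the client certificate rotation reported as "Client rotation is on";
    # certificates land in the --cert-dir shown above (/var/lib/kubelet/pki)
    rotateCertificates: true
    # server-side rotation is a separate knob; assumed off here
    serverTLSBootstrap: false
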
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.875892 4762 server.go:997] "Starting client certificate rotation"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.875926 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.878563 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-03 16:11:56.727170595 +0000 UTC
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.878761 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.904357 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.909419 4762 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 14:05:19 crc kubenswrapper[4762]: E0217 14:05:19.910241 4762 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.926382 4762 log.go:25] "Validated CRI v1 runtime API"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.959160 4762 log.go:25] "Validated CRI v1 image API"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.961168 4762 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.965597 4762 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-14-00-56-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.965639 4762 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.985263 4762 manager.go:217] Machine: {Timestamp:2026-02-17 14:05:19.98102132 +0000 UTC m=+0.561021982 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f4e79948-4d35-4f10-94ee-0c0db8bd23cc BootID:0948f442-754f-492a-b255-7c21a6e922d3 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:88:ed:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:ed:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f0:67:98 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:77:61:42 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b3:47:8b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:36:45:42 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:7e:71:c7:82:eb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:a3:13:1d:eb:2b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.985588 4762 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.985930 4762 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.986310 4762 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.986519 4762 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.986575 4762 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.986859 4762 topology_manager.go:138] "Creating topology manager with none policy"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.986878 4762 container_manager_linux.go:303] "Creating device plugin manager"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.987454 4762 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.987494 4762 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.987782 4762 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.987883 4762 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.992588 4762 kubelet.go:418] "Attempting to sync node with API server"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.992613 4762 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.992629 4762 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.992732 4762 kubelet.go:324] "Adding apiserver pod source"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.992754 4762 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.997952 4762 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.998273 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused
Feb 17 14:05:19 crc kubenswrapper[4762]: E0217 14:05:19.998397 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:05:19 crc kubenswrapper[4762]: W0217 14:05:19.998371 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused
Feb 17 14:05:19 crc kubenswrapper[4762]: E0217 14:05:19.998482 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:05:19 crc kubenswrapper[4762]: I0217 14:05:19.999011 4762 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
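
[Editor's note — annotation added for readability; not part of the journal] The container_manager_linux.go:272 entry above embeds the effective node configuration as a single JSON object (nodeConfig={...}): systemd cgroup driver on cgroup v2, SystemReserved of 200m CPU / 350Mi memory / 350Mi ephemeral-storage, PodPidsLimit 4096, and the standard memory/nodefs/imagefs hard-eviction thresholds. With the journal split one entry per line as above, the object can be pulled out and pretty-printed; a minimal sketch under the same hypothetical "kubelet.log" export (Python 3, stdlib only):

    import json

    with open("kubelet.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            marker = line.find("nodeConfig=")
            if marker == -1:
                continue
            # The JSON object runs to the end of this log line.
            cfg = json.loads(line[marker + len("nodeConfig="):])
            print(json.dumps(cfg, indent=2, sort_keys=True))
            break
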
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.001051 4762 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002668 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002700 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002713 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002725 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002744 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002756 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002770 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002789 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002804 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002818 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002835 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.002847 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.003946 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.004609 4762 server.go:1280] "Started kubelet"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.005883 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.006098 4762 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.006085 4762 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 14:05:20 crc systemd[1]: Started Kubernetes Kubelet.
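
[Editor's note — annotation added for readability; not part of the journal] Every failure so far is the same symptom: a TCP dial to 38.102.83.214:6443 (api-int.crc.testing) refused, affecting the CSR request, the Node and Service reflectors, and the CSINode probe above. That is expected in this window, since the kubelet is starting before the static-pod control plane is listening; the errors are only worth chasing if they persist after the API server is up. A quick tally of refused endpoints, under the same hypothetical "kubelet.log" export:

    import re
    from collections import Counter

    # Captures the host:port of every refused TCP dial in the journal.
    DIAL = re.compile(r"dial tcp ([\d.]+:\d+): connect: connection refused")

    refused = Counter()
    with open("kubelet.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            refused.update(DIAL.findall(line))

    for addr, count in refused.most_common():
        print(f"{count:5d}  {addr}")
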
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.007277 4762 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008375 4762 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008487 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008778 4762 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008848 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:06:23.051437532 +0000 UTC
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008914 4762 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008985 4762 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.013535 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.008925 4762 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.016909 4762 factory.go:55] Registering systemd factory
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.017390 4762 factory.go:221] Registration of the systemd container factory successfully
Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.017302 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="200ms"
Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.016771 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused
Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.017671 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.018238 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.214:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950db9c755d6d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:05:20.004560597 +0000 UTC m=+0.584561259,LastTimestamp:2026-02-17 14:05:20.004560597 +0000 UTC m=+0.584561259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.024228 4762 factory.go:153] Registering CRI-O factory
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.024285 4762 factory.go:221] Registration of the crio container factory successfully
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.024390 4762 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.024434 4762 factory.go:103] Registering Raw factory
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.024452 4762 manager.go:1196] Started watching for new ooms in manager
Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.025296 4762 manager.go:319] Starting recovery of all containers
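
[Editor's note — annotation added for readability; not part of the journal] The long run of reconstruct.go:130 entries that follows is the volume manager rebuilding its actual state of world from mounts left on disk across the restart: each volume is re-added as "uncertain" until it can be reconciled, and the single reconstruct.go:144 entry does the same for a mounted CSI device (the kubevirt.io.hostpath-provisioner PVC). To summarize the run by volume plugin, under the same hypothetical "kubelet.log" export:

    import re
    from collections import Counter

    # volumeName begins "kubernetes.io/<plugin>/...", e.g. secret, configmap, csi.
    VOLUME = re.compile(r'reconstruct\.go:130\].*volumeName="kubernetes\.io/([^/"]+)/')

    plugins = Counter()
    with open("kubelet.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            plugins.update(VOLUME.findall(line))

    for plugin, count in plugins.most_common():
        print(f"{count:4d}  {plugin}")
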
seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033764 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033806 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033817 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033829 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033844 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033857 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033870 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.033882 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.036433 4762 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.036753 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.036898 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037003 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037113 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037234 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037370 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037478 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037582 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037776 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.037895 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038109 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038232 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038351 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038542 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038620 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038664 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038685 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038699 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038716 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038733 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038748 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038764 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038780 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038795 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038812 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038828 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038844 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038858 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038879 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038894 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038909 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038924 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038942 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038958 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038980 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.038998 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039019 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039036 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039055 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039071 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039088 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039124 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039143 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039158 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039171 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039185 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039198 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039218 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039232 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039247 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039301 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039317 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039330 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039344 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039359 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039373 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039387 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039402 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039415 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039429 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039449 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039462 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039479 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039494 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039507 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039522 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039545 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039567 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039584 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039598 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039621 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039637 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039672 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039688 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039706 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039722 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039736 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039750 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039767 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039782 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039795 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039808 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039849 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039867 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039882 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039895 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039909 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039947 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.039981 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040588 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040615 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040629 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040658 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040673 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040688 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040705 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040719 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040732 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040745 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040766 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040783 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040808 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040823 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040836 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040861 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040874 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040888 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040902 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040918 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040933 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040947 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040961 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040975 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.040989 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041009 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041029 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041073 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041087 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041099 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041112 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041128 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041145 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041159 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041174 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041195 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041209 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041226 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041240 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041255 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041275 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041286 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041313 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041330 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041342 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041353 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041364 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041376 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041386 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041396 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041406 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041419 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041612 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041626 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041639 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041744 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041758 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041783 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041799 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041812 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041823 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041835 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041852 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041869 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041880 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041891 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041903 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041914 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041924 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041934 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041945 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041956 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041969 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041982 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.041998 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.042011 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.046869 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.047289 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.048582 4762 manager.go:324] Recovery completed Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.048719 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049096 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049164 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049191 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049207 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049229 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049246 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049268 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049286 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049327 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049356 4762 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049372 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049392 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049411 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049429 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049447 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049461 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049475 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049493 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049509 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049542 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049560 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049580 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049596 4762 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049619 4762 reconstruct.go:97] "Volume reconstruction finished" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.049637 4762 reconciler.go:26] "Reconciler: start to sync state" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.063231 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.066255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.066308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.066321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.067432 4762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.067900 4762 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.067914 4762 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.067973 4762 state_mem.go:36] "Initialized new in-memory state store" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.069495 4762 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.069555 4762 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.069591 4762 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.069636 4762 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.070399 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.070514 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.081026 4762 policy_none.go:49] "None policy: Start" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.081909 4762 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.081940 4762 state_mem.go:35] "Initializing new in-memory state store" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.114475 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.142380 4762 manager.go:334] "Starting Device Plugin manager" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.143527 4762 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.143592 4762 server.go:79] "Starting device plugin registration server" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.144326 4762 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.144411 4762 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.144783 4762 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.144945 4762 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.144965 4762 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.152833 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.170006 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:05:20 crc kubenswrapper[4762]: 
I0217 14:05:20.170165 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172157 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172306 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172629 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.172718 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173383 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173455 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173910 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174045 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.173936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.174636 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175026 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175084 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.175970 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176149 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176185 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176268 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.176280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177234 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177308 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177371 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.177325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.178025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.178058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.178072 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.222383 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="400ms" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.246848 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.249152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.249200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.249211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.249244 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.249882 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.214:6443: connect: connection refused" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253057 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253138 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253164 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253289 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253343 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253395 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253447 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253515 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253551 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253579 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253685 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.253722 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.354949 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355022 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355068 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355094 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355113 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355157 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355186 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355249 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355271 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355285 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355291 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355356 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355337 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc 
kubenswrapper[4762]: I0217 14:05:20.355512 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355609 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355689 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.355809 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356257 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356305 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356191 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.356397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.359124 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.359458 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.450842 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.453443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.453497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.453509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.453547 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.454284 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.214:6443: connect: connection refused" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.510690 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.530950 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.548044 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-14cf30946a8c21865aca069d22b0c56fa18835d9bce58fdd6e7babfee8f330b1 WatchSource:0}: Error finding container 14cf30946a8c21865aca069d22b0c56fa18835d9bce58fdd6e7babfee8f330b1: Status 404 returned error can't find the container with id 14cf30946a8c21865aca069d22b0c56fa18835d9bce58fdd6e7babfee8f330b1 Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.556683 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-40bc569ff6c15a76ea0d8463ec99454563aff9dce114b564a6c902e69bb8e781 WatchSource:0}: Error finding container 40bc569ff6c15a76ea0d8463ec99454563aff9dce114b564a6c902e69bb8e781: Status 404 returned error can't find the container with id 40bc569ff6c15a76ea0d8463ec99454563aff9dce114b564a6c902e69bb8e781 Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.564321 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.575250 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.580262 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.586864 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-949a88570621417ec9dec5a6ffd5c8f5fe3d68e7da6ac6b39e9666e4164a7ac8 WatchSource:0}: Error finding container 949a88570621417ec9dec5a6ffd5c8f5fe3d68e7da6ac6b39e9666e4164a7ac8: Status 404 returned error can't find the container with id 949a88570621417ec9dec5a6ffd5c8f5fe3d68e7da6ac6b39e9666e4164a7ac8 Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.623993 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="800ms" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.855287 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.856928 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.856975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.856986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:20 crc kubenswrapper[4762]: I0217 14:05:20.857015 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.857592 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.214:6443: connect: connection refused" node="crc" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.864457 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.864553 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.874599 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.874731 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:20 crc kubenswrapper[4762]: W0217 14:05:20.902028 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:20 crc kubenswrapper[4762]: E0217 14:05:20.902135 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.007744 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.009741 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 17:43:00.387914899 +0000 UTC Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.075333 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"949a88570621417ec9dec5a6ffd5c8f5fe3d68e7da6ac6b39e9666e4164a7ac8"} Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.076621 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"40bc569ff6c15a76ea0d8463ec99454563aff9dce114b564a6c902e69bb8e781"} Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 
14:05:21.077649 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14cf30946a8c21865aca069d22b0c56fa18835d9bce58fdd6e7babfee8f330b1"} Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.078595 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9e79943f6f14dee30a224982afe0ec50c2246bf121a5e28d3b1380a60291143"} Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.079954 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8cf8bcba0b3572369dea2e177149e6183a2678b2246949b8b4e7f1c5864bd8f3"} Feb 17 14:05:21 crc kubenswrapper[4762]: W0217 14:05:21.368631 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:21 crc kubenswrapper[4762]: E0217 14:05:21.368721 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:21 crc kubenswrapper[4762]: E0217 14:05:21.425603 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="1.6s" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.658126 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.660798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.660831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.660840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.660860 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:21 crc kubenswrapper[4762]: E0217 14:05:21.661319 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.214:6443: connect: connection refused" node="crc" Feb 17 14:05:21 crc kubenswrapper[4762]: I0217 14:05:21.946819 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:05:21 crc kubenswrapper[4762]: E0217 14:05:21.947978 4762 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.006903 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.009886 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:00:40.859749767 +0000 UTC Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.083967 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92" exitCode=0 Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.084054 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.084090 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.085034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.085070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.085081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.086491 4762 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172" exitCode=0 Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.086555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.086615 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.086684 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.087738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.087767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.087779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.087831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc 
kubenswrapper[4762]: I0217 14:05:22.087864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.087874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.088309 4762 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01" exitCode=0 Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.088372 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.088383 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.089175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.089213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.089221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.090826 4762 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6" exitCode=0 Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.090882 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.090885 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.091612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.091653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.091667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.093494 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.093520 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 
14:05:22.093530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.093539 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2"} Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.093541 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.094117 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.094142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:22 crc kubenswrapper[4762]: I0217 14:05:22.094151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:22 crc kubenswrapper[4762]: W0217 14:05:22.708891 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:22 crc kubenswrapper[4762]: E0217 14:05:22.708957 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.006780 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.009996 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:43:23.664982308 +0000 UTC Feb 17 14:05:23 crc kubenswrapper[4762]: E0217 14:05:23.027279 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="3.2s" Feb 17 14:05:23 crc kubenswrapper[4762]: W0217 14:05:23.072809 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:23 crc kubenswrapper[4762]: E0217 14:05:23.072893 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.098231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.098272 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.099181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.099222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.099235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101565 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101596 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101606 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101628 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.101734 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.102440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.102463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.102471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.103856 4762 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593" exitCode=0 Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.103901 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.103974 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.104536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.104556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.104563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.107137 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.107476 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.107772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.107796 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.107806 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d"} Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108116 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108500 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.108529 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: W0217 
14:05:23.112528 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:23 crc kubenswrapper[4762]: E0217 14:05:23.112582 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.261449 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.262451 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.262490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.262503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.262526 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:23 crc kubenswrapper[4762]: E0217 14:05:23.262935 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.214:6443: connect: connection refused" node="crc" Feb 17 14:05:23 crc kubenswrapper[4762]: W0217 14:05:23.334089 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.214:6443: connect: connection refused Feb 17 14:05:23 crc kubenswrapper[4762]: E0217 14:05:23.334447 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.214:6443: connect: connection refused" logger="UnhandledError" Feb 17 14:05:23 crc kubenswrapper[4762]: I0217 14:05:23.548128 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.011099 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 11:02:39.748962793 +0000 UTC Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112543 4762 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f" exitCode=0 Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112636 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112671 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112687 4762 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f"} Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112816 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112880 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.112822 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114047 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114080 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114523 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.114612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.115574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.115604 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.115631 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:24 crc kubenswrapper[4762]: I0217 14:05:24.620880 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.011366 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:46:07.556102319 +0000 UTC Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119709 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e"} Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119777 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119778 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc"} Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119919 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119979 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119907 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95"} Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.120283 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46"} Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.119855 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.120306 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522"} Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.120488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.120514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.120525 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.121193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.121221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.121230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.121979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.122026 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.122045 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:25 crc 
kubenswrapper[4762]: I0217 14:05:25.273282 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.507442 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.507682 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.508804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.508834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:25 crc kubenswrapper[4762]: I0217 14:05:25.508844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.012231 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:41:40.705181991 +0000 UTC Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.121811 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.121893 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.122875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.243033 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.343593 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.464088 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.465256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.465299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 
14:05:26.465308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.465338 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.822900 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.823081 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.823995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.824022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:26 crc kubenswrapper[4762]: I0217 14:05:26.824029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.012760 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:14:51.865931566 +0000 UTC Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.123737 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.125695 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.125768 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.125792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.179753 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.180074 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.181712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.181753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:27 crc kubenswrapper[4762]: I0217 14:05:27.181763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:28 crc kubenswrapper[4762]: I0217 14:05:28.013214 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:00:06.527714971 +0000 UTC Feb 17 14:05:28 crc kubenswrapper[4762]: I0217 14:05:28.740432 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:28 crc kubenswrapper[4762]: I0217 14:05:28.740634 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:28 crc 
kubenswrapper[4762]: I0217 14:05:28.741830 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:28 crc kubenswrapper[4762]: I0217 14:05:28.741887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:28 crc kubenswrapper[4762]: I0217 14:05:28.741900 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.013788 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 13:04:47.766604964 +0000 UTC Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.196963 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.197120 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.198077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.198111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.198119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.823846 4762 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:05:29 crc kubenswrapper[4762]: I0217 14:05:29.823965 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.013872 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 03:08:30.652316614 +0000 UTC Feb 17 14:05:30 crc kubenswrapper[4762]: E0217 14:05:30.153022 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.323920 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.324059 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.325089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.325121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.325132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:30 crc kubenswrapper[4762]: I0217 14:05:30.330391 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.014923 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:47:50.65662135 +0000 UTC Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.133907 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.134960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.135001 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.135012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:31 crc kubenswrapper[4762]: I0217 14:05:31.137989 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:32 crc kubenswrapper[4762]: I0217 14:05:32.015458 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:07:34.789057481 +0000 UTC Feb 17 14:05:32 crc kubenswrapper[4762]: I0217 14:05:32.136739 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:32 crc kubenswrapper[4762]: I0217 14:05:32.137686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:32 crc kubenswrapper[4762]: I0217 14:05:32.137721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:32 crc kubenswrapper[4762]: I0217 14:05:32.137732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:33 crc kubenswrapper[4762]: I0217 14:05:33.016230 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 23:16:37.24063619 +0000 UTC Feb 17 14:05:33 crc kubenswrapper[4762]: I0217 14:05:33.036040 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 14:05:33 crc kubenswrapper[4762]: I0217 14:05:33.036109 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:05:33 crc kubenswrapper[4762]: I0217 14:05:33.548521 4762 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:05:33 crc kubenswrapper[4762]: I0217 14:05:33.548838 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:05:34 crc kubenswrapper[4762]: I0217 14:05:34.007769 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 14:05:34 crc kubenswrapper[4762]: I0217 14:05:34.017114 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:17:32.485830144 +0000 UTC Feb 17 14:05:34 crc kubenswrapper[4762]: I0217 14:05:34.349693 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 14:05:34 crc kubenswrapper[4762]: I0217 14:05:34.349787 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 14:05:35 crc kubenswrapper[4762]: I0217 14:05:35.017618 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:45:49.798702674 +0000 UTC Feb 17 14:05:36 crc kubenswrapper[4762]: I0217 14:05:36.018046 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 10:12:47.79348079 +0000 UTC Feb 17 14:05:37 crc kubenswrapper[4762]: I0217 14:05:37.019118 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 16:08:44.625424918 +0000 UTC Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.019743 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 07:49:18.766405914 +0000 UTC Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.551824 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.551981 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.552856 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.552886 4762 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.552897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:38 crc kubenswrapper[4762]: I0217 14:05:38.555916 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.020003 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:28:55.880144439 +0000 UTC Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.152267 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.152321 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.153240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.153278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.153289 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.256397 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.256791 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.257637 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.257740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.257806 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.269619 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 14:05:39 crc kubenswrapper[4762]: E0217 14:05:39.345093 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.348942 4762 trace.go:236] Trace[1644903706]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:05:27.763) (total time: 11585ms): Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[1644903706]: ---"Objects listed" error: 11585ms (14:05:39.348) Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[1644903706]: [11.58504503s] [11.58504503s] END Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.394611 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.394873 4762 trace.go:236] Trace[174962368]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:05:29.095) (total time: 10299ms): Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[174962368]: ---"Objects listed" error: 10299ms (14:05:39.394) Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[174962368]: [10.299309934s] [10.299309934s] END Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.394910 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:39 crc kubenswrapper[4762]: E0217 14:05:39.396966 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.397531 4762 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.402291 4762 trace.go:236] Trace[2135486324]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:05:27.186) (total time: 12215ms): Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[2135486324]: ---"Objects listed" error: 12215ms (14:05:39.402) Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[2135486324]: [12.215995387s] [12.215995387s] END Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.402313 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.402509 4762 trace.go:236] Trace[2071620868]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:05:26.983) (total time: 12418ms): Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[2071620868]: ---"Objects listed" error: 12418ms (14:05:39.402) Feb 17 14:05:39 crc kubenswrapper[4762]: Trace[2071620868]: [12.418428316s] [12.418428316s] END Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.402533 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.406413 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.421169 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44054->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.421452 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44054->192.168.126.11:17697: read: connection reset by peer" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.422133 4762 csr.go:261] certificate signing request csr-t6ldn is approved, waiting to be issued Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.423153 4762 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection 
refused" start-of-body= Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.423225 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.632226 4762 csr.go:257] certificate signing request csr-t6ldn is issued Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.824050 4762 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.824105 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:05:39 crc kubenswrapper[4762]: I0217 14:05:39.875445 4762 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 14:05:39 crc kubenswrapper[4762]: W0217 14:05:39.875693 4762 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 14:05:39 crc kubenswrapper[4762]: W0217 14:05:39.875723 4762 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 14:05:39 crc kubenswrapper[4762]: E0217 14:05:39.875629 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.214:58884->38.102.83.214:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18950db9e865404f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:05:20.559218767 +0000 UTC m=+1.139219419,LastTimestamp:2026-02-17 14:05:20.559218767 +0000 UTC m=+1.139219419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:05:39 crc kubenswrapper[4762]: W0217 14:05:39.875760 4762 reflector.go:484] 
k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 14:05:39 crc kubenswrapper[4762]: W0217 14:05:39.875820 4762 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.006266 4762 apiserver.go:52] "Watching apiserver" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.016794 4762 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.017147 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-s25qb"] Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.017505 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.017586 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.017704 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.017833 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.018026 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.018047 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.018064 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.018275 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.018284 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.018458 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.019203 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.019463 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.019595 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.020444 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:00:18.851834407 +0000 UTC Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.020568 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.020634 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.022002 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.022775 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.023022 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.023046 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.023098 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.023165 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.024144 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.038468 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.058284 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.066141 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.080251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.090166 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.099192 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.107558 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.117024 4762 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.119997 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.127468 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.138454 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.148547 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.155859 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.157975 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a" exitCode=255 Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.158040 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a"} Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.166399 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.174097 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.174217 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.174900 4762 scope.go:117] "RemoveContainer" containerID="104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.176259 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.190226 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209092 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209479 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209568 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209636 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209716 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209788 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.209883 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210688 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210836 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210911 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211158 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211353 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211462 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211808 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211909 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.211997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212104 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212542 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212689 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212818 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212888 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.212989 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213076 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213154 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213225 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213305 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\")
" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213396 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213481 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.213878 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214011 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214083 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214185 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214289 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214363 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214428 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214491 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214560 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214627 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214832 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.214965 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215084 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215206 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215290 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215383 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216460 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216488 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216513 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216544 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216570 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216596 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216623 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216665 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216691 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216714 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216735 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216754 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216772 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216788 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216804 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216820 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216837 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216854 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216870 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 
14:05:40.216887 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216902 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216929 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216981 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217015 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217032 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217049 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217064 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217099 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217128 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217145 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217162 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217180 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217196 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217212 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217227 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217243 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217261 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217281 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217299 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217316 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217333 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217348 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217363 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217380 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217395 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217420 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:05:40 crc 
kubenswrapper[4762]: I0217 14:05:40.217437 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217455 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217472 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217488 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217504 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217521 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217537 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217554 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217570 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217587 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 
17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217604 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217618 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217634 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217677 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.217998 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218018 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218034 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218049 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218069 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218086 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218102 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218117 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218134 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218149 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218163 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218178 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218194 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218210 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218224 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218240 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" 
(UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218257 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218273 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218289 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218307 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218324 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218340 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218355 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218371 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218388 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218440 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218526 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218544 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218580 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218604 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218628 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218675 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218700 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218726 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218757 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218782 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218807 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218825 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218843 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218861 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218877 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218899 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218915 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218931 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218948 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218965 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.218984 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219002 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219023 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219041 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219059 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:05:40 crc 
kubenswrapper[4762]: I0217 14:05:40.219095 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219144 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219161 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219179 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219196 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219212 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219229 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219245 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219262 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219279 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219295 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219314 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219331 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219347 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219364 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219380 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219398 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219415 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219432 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219449 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219467 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219484 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219502 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219521 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219537 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219554 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219571 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219591 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" 
(UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219631 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219672 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219692 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219746 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58hgx\" (UniqueName: \"kubernetes.io/projected/cba5d7d1-c9f6-4012-9380-9abc9449564c-kube-api-access-58hgx\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219770 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219843 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219953 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.219980 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba5d7d1-c9f6-4012-9380-9abc9449564c-hosts-file\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220021 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220050 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220852 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210342 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210470 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210517 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210508 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.210553 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215541 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.215956 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216042 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.216269 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.220149 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.227258 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:40.727231932 +0000 UTC m=+21.307232584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227429 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227466 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227633 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227673 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227843 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.227877 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228053 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228106 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228212 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228402 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228693 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.228890 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.229108 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.229326 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.229379 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.229722 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.229798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230074 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230142 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230175 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220440 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220462 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220722 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230260 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220770 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220810 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220863 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220986 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.221003 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.221079 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.221098 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.221902 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.221925 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222044 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222259 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222297 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222465 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222538 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222783 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.222799 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.223105 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.223357 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.223674 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.223730 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.223740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.224029 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.224108 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.224684 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.224712 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.224798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230511 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.230562 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231135 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.231329 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:05:40.731312413 +0000 UTC m=+21.311313245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231448 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231583 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231851 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.231994 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232080 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232100 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232273 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232520 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232534 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232564 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.232834 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233143 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233398 4762 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233481 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233557 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233711 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233367 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.233733 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234232 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234328 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234353 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234517 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234753 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.234904 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235044 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235042 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235259 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235340 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235365 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235463 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235533 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.235762 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.236097 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.236308 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.236314 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.236869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.237080 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.237260 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.237327 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:40.737305666 +0000 UTC m=+21.317306498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238107 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238300 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238405 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238058 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238706 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.238841 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239010 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239174 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239339 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239164 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239567 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239864 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239895 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.239930 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.240397 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.240554 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.240776 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241069 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241153 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241674 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241759 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241773 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.241952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.242209 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.242564 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.242750 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.243798 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.220241 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.246686 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode
\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.252075 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.252116 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.252133 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.252204 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:40.752181131 +0000 UTC m=+21.332181973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.255461 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.255490 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.255504 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.255585 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:40.755540892 +0000 UTC m=+21.335541734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.255880 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.260112 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.260621 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.260873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.261947 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.262506 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.262587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.264957 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.265751 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.265973 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.266159 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.266208 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.266295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.266403 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.266428 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272145 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272274 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272590 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272662 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272890 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.272900 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.273012 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.274578 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.274723 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.275952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.276148 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278284 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278439 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278857 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278978 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.279259 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.279301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.276172 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.277871 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.277856 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278030 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.278062 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.280000 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.280476 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.280661 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.281204 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.281353 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.281981 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.282288 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.282371 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.280918 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.282438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.282481 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.282510 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283035 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283081 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283163 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283203 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283383 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283623 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.283873 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284060 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284284 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284488 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284512 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284799 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284811 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.284933 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.285075 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.287801 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.290066 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.290203 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.290526 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.290795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.290845 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.297461 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.299733 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.305034 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.311170 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.318853 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322007 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322341 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322376 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58hgx\" (UniqueName: \"kubernetes.io/projected/cba5d7d1-c9f6-4012-9380-9abc9449564c-kube-api-access-58hgx\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba5d7d1-c9f6-4012-9380-9abc9449564c-hosts-file\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322568 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322582 4762 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322620 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322631 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322658 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322669 4762 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322679 4762 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322690 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322700 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322711 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322723 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322734 4762 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322746 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322771 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322782 4762 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322793 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322804 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322814 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322827 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322838 4762 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322849 4762 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322860 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322874 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322871 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322932 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322944 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322955 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.322966 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323007 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323020 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323031 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323037 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323043 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba5d7d1-c9f6-4012-9380-9abc9449564c-hosts-file\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323062 4762 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323075 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323087 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323100 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323111 4762 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323121 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323133 4762 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323143 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323159 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323170 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323180 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323191 4762 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323201 4762 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323219 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323230 4762 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323242 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323257 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323272 4762 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath 
\"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323283 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323293 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323304 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323319 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323330 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323341 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323351 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323361 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323371 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323386 4762 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323396 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323413 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323424 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323453 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323466 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323477 4762 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323487 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323497 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323507 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323521 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323531 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323541 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323552 4762 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323574 4762 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323584 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323594 4762 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323612 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323621 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323631 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323901 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323919 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323938 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323950 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323962 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.323974 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324014 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324034 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324073 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324088 4762 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324099 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324109 4762 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324120 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324130 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324140 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324149 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324159 4762 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324168 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324178 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324191 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324201 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324211 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324222 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324232 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324242 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324252 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324262 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324270 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324279 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324287 4762 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324296 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324304 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324313 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324322 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324335 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324344 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324354 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324364 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324384 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324394 4762 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324405 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324424 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324435 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324448 4762 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324457 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324467 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324477 4762 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324493 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324502 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324512 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324522 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324532 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324542 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324552 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324563 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324580 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324591 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324605 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324615 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324626 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324638 4762 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324673 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" 
(UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324684 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324695 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324710 4762 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324720 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324730 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324743 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324753 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324763 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324776 4762 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324786 4762 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324798 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324808 4762 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324825 4762 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324838 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324849 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324859 4762 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324871 4762 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324881 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324891 4762 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324903 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324913 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324922 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324933 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324941 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324949 4762 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324957 4762 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324965 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.324973 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325009 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325020 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325030 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325041 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325050 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325060 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325070 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325080 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325095 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325105 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325114 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325127 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325139 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325149 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325159 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325170 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325183 4762 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325193 4762 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325203 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325213 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325242 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325256 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325267 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325277 4762 reconciler_common.go:293] "Volume detached for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325287 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.325296 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.330091 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.342416 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.344787 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.344796 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.365832 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.392055 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.392283 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58hgx\" (UniqueName: \"kubernetes.io/projected/cba5d7d1-c9f6-4012-9380-9abc9449564c-kube-api-access-58hgx\") pod \"node-resolver-s25qb\" (UID: \"cba5d7d1-c9f6-4012-9380-9abc9449564c\") " pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.633193 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 14:00:39 +0000 UTC, rotation deadline is 2026-11-23 17:18:45.158280478 +0000 UTC Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.633259 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6699h13m4.525024172s for next certificate rotation Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.651520 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s25qb" Feb 17 14:05:40 crc kubenswrapper[4762]: W0217 14:05:40.661658 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba5d7d1_c9f6_4012_9380_9abc9449564c.slice/crio-d37bd4dff903330a77f8362ecdd5d798fde44616e0846bdb5ea88c00be2622f5 WatchSource:0}: Error finding container d37bd4dff903330a77f8362ecdd5d798fde44616e0846bdb5ea88c00be2622f5: Status 404 returned error can't find the container with id d37bd4dff903330a77f8362ecdd5d798fde44616e0846bdb5ea88c00be2622f5 Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.731385 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.731478 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.731573 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:05:41.73154417 +0000 UTC m=+22.311544822 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.731580 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.731628 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:41.731622172 +0000 UTC m=+22.311622814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.832610 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.832689 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:40 crc kubenswrapper[4762]: I0217 14:05:40.832726 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832804 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832827 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832834 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832848 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832880 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:41.832864416 +0000 UTC m=+22.412865068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832904 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:41.832888096 +0000 UTC m=+22.412888748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832952 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832966 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.832978 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:40 crc kubenswrapper[4762]: E0217 14:05:40.833005 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:41.832997049 +0000 UTC m=+22.412997811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.020679 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 02:26:46.758578791 +0000 UTC Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.070636 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.070789 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.125565 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xpj6v"] Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.126292 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4r7p8"] Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.126516 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.126566 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.128185 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.128467 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.128619 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.128883 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.128888 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.129233 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.133500 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.149787 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.162088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s25qb" event={"ID":"cba5d7d1-c9f6-4012-9380-9abc9449564c","Type":"ContainerStarted","Data":"8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.162129 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s25qb" event={"ID":"cba5d7d1-c9f6-4012-9380-9abc9449564c","Type":"ContainerStarted","Data":"d37bd4dff903330a77f8362ecdd5d798fde44616e0846bdb5ea88c00be2622f5"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.163496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4c5cd4d6bf6b2e1efa46b165d239cf7d9c5b95dbcbe3a2ab532d62643e248c6f"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.165166 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.165212 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.165245 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1588c0d34b999321cc11073b72d4e25488e557d58e1c9bd8b1d3ca9f248ab607"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.166978 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.167059 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.167118 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"11829814ff8029505ca063d3c67f15ff74cf092900e9ed39a284901f9d7f0684"} Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.169275 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.172161 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d"} Feb 17 14:05:41 crc kubenswrapper[4762]: 
I0217 14:05:41.182213 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.203820 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.214884 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.227489 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-netns\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235522 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-os-release\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-system-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cnibin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235610 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-binary-copy\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235625 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlz5\" (UniqueName: \"kubernetes.io/projected/18a966ae-76bd-4298-9964-8be5f5b1dc95-kube-api-access-gqlz5\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235665 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-bin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235680 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-etc-kubernetes\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235699 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-conf-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235784 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-multus\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235827 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-multus-certs\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235849 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-hostroot\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235869 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-kubelet\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-k8s-cni-cncf-io\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235942 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g987m\" (UniqueName: \"kubernetes.io/projected/c1057884-d2c5-4911-9b97-fb4fedba9ab1-kube-api-access-g987m\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.235995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-os-release\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cni-binary-copy\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-system-cni-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236082 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-cnibin\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236104 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-daemon-config\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236123 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-socket-dir-parent\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.236164 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.245449 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25
bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.257211 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.268162 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.279990 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.290625 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.304505 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.326917 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c
676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c
15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337329 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-binary-copy\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337394 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-bin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-etc-kubernetes\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337473 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlz5\" (UniqueName: \"kubernetes.io/projected/18a966ae-76bd-4298-9964-8be5f5b1dc95-kube-api-access-gqlz5\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc 
kubenswrapper[4762]: I0217 14:05:41.337563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-conf-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337604 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-multus\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337668 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-multus-certs\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337692 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-hostroot\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337715 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-kubelet\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337758 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-k8s-cni-cncf-io\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-os-release\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g987m\" (UniqueName: \"kubernetes.io/projected/c1057884-d2c5-4911-9b97-fb4fedba9ab1-kube-api-access-g987m\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cni-binary-copy\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337943 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-system-cni-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.337964 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-cnibin\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-socket-dir-parent\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-daemon-config\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338063 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-multus\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338108 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338180 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-netns\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338211 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-etc-kubernetes\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-os-release\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " 
pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338183 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-cni-bin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338446 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cnibin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338506 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-system-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338601 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-system-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338776 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-k8s-cni-cncf-io\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.338839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-os-release\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339094 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-hostroot\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339134 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-var-lib-kubelet\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339156 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-multus-certs\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339206 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-socket-dir-parent\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339211 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-cnibin\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339266 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-conf-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339440 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-host-run-netns\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339508 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-os-release\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339551 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-daemon-config\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-system-cni-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-binary-copy\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339608 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cnibin\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339609 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1057884-d2c5-4911-9b97-fb4fedba9ab1-multus-cni-dir\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc 
kubenswrapper[4762]: I0217 14:05:41.339717 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18a966ae-76bd-4298-9964-8be5f5b1dc95-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.339834 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1057884-d2c5-4911-9b97-fb4fedba9ab1-cni-binary-copy\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.340110 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18a966ae-76bd-4298-9964-8be5f5b1dc95-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.359280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlz5\" (UniqueName: \"kubernetes.io/projected/18a966ae-76bd-4298-9964-8be5f5b1dc95-kube-api-access-gqlz5\") pod \"multus-additional-cni-plugins-xpj6v\" (UID: \"18a966ae-76bd-4298-9964-8be5f5b1dc95\") " pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.361672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g987m\" (UniqueName: \"kubernetes.io/projected/c1057884-d2c5-4911-9b97-fb4fedba9ab1-kube-api-access-g987m\") pod \"multus-4r7p8\" (UID: \"c1057884-d2c5-4911-9b97-fb4fedba9ab1\") " pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.366888 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.385320 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.396622 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.408754 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.419115 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.430417 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.438326 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.444776 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4r7p8" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.444889 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: W0217 14:05:41.455537 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a966ae_76bd_4298_9964_8be5f5b1dc95.slice/crio-1c2611466480ee3da8ec398763ad027a8abd3b719f262c71a22296fb1f948b46 WatchSource:0}: Error finding container 1c2611466480ee3da8ec398763ad027a8abd3b719f262c71a22296fb1f948b46: Status 404 returned error can't find the container with id 1c2611466480ee3da8ec398763ad027a8abd3b719f262c71a22296fb1f948b46 Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.457794 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.491743 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rwhnp"] Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.493307 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.493462 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vksr"] Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.494574 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.494991 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.495162 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.495164 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.495561 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.495978 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.496195 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.496946 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.500184 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.500293 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.500454 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.500469 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.500584 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.511210 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.521988 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.537311 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540019 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540054 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb11ce5-3ff7-4743-a879-95285dae2998-rootfs\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540078 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540098 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540156 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540224 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540280 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6hn\" (UniqueName: \"kubernetes.io/projected/3eb11ce5-3ff7-4743-a879-95285dae2998-kube-api-access-nq6hn\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540301 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540320 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd\") pod \"ovnkube-node-7vksr\" (UID: 
\"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540357 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5t9\" (UniqueName: \"kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540376 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540404 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540465 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540487 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540509 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb11ce5-3ff7-4743-a879-95285dae2998-mcd-auth-proxy-config\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540574 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540624 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.540660 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb11ce5-3ff7-4743-a879-95285dae2998-proxy-tls\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.550577 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.566127 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.583549 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.594179 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.606265 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.619446 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.634152 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641528 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641622 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641664 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641688 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641689 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641711 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641709 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641747 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641771 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641771 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641866 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb11ce5-3ff7-4743-a879-95285dae2998-mcd-auth-proxy-config\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641899 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641921 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb11ce5-3ff7-4743-a879-95285dae2998-proxy-tls\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641977 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/3eb11ce5-3ff7-4743-a879-95285dae2998-rootfs\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642024 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642086 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642115 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642137 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642156 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6hn\" (UniqueName: \"kubernetes.io/projected/3eb11ce5-3ff7-4743-a879-95285dae2998-kube-api-access-nq6hn\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642204 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5t9\" (UniqueName: \"kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642224 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642306 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.642897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643058 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb11ce5-3ff7-4743-a879-95285dae2998-rootfs\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643068 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.641927 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643220 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb11ce5-3ff7-4743-a879-95285dae2998-mcd-auth-proxy-config\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643254 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units\") pod \"ovnkube-node-7vksr\" (UID: 
\"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643532 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643567 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.643732 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.644371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.644691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.647573 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.648472 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.648605 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb11ce5-3ff7-4743-a879-95285dae2998-proxy-tls\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.660499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5t9\" (UniqueName: \"kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9\") pod \"ovnkube-node-7vksr\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.661136 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.661313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6hn\" (UniqueName: \"kubernetes.io/projected/3eb11ce5-3ff7-4743-a879-95285dae2998-kube-api-access-nq6hn\") pod \"machine-config-daemon-rwhnp\" (UID: \"3eb11ce5-3ff7-4743-a879-95285dae2998\") " pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.684721 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.699447 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.714236 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.726929 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.738184 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.743202 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.743289 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.743398 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.743446 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:43.743431463 +0000 UTC m=+24.323432115 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.743510 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:05:43.743490675 +0000 UTC m=+24.323491327 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.749045 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.758451 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.769033 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.783210 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.796595 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.808868 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.826108 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.827246 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: W0217 14:05:41.838104 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb11ce5_3ff7_4743_a879_95285dae2998.slice/crio-b02fe83ac19bf716d3f11ee55cc0d77a82fb6b3b814320f25d064017ee2fe12d WatchSource:0}: Error finding container b02fe83ac19bf716d3f11ee55cc0d77a82fb6b3b814320f25d064017ee2fe12d: Status 404 returned error can't find the container with id b02fe83ac19bf716d3f11ee55cc0d77a82fb6b3b814320f25d064017ee2fe12d Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.844074 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.844404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:41 crc 
kubenswrapper[4762]: I0217 14:05:41.844447 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.844487 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:41 crc kubenswrapper[4762]: I0217 14:05:41.844600 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.844877 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.844936 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:43.844917544 +0000 UTC m=+24.424918206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.844976 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845020 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845033 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845107 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:43.845064818 +0000 UTC m=+24.425065470 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845136 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845199 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845225 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:41 crc kubenswrapper[4762]: E0217 14:05:41.845326 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:43.845290504 +0000 UTC m=+24.425291206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:41 crc kubenswrapper[4762]: W0217 14:05:41.862209 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab134be0_88ef_45ac_80e0_963a60169ad2.slice/crio-68b1affc067a8160a4de26baac09a6bc0782eec9060a2a6bcba2732a213a64e4 WatchSource:0}: Error finding container 68b1affc067a8160a4de26baac09a6bc0782eec9060a2a6bcba2732a213a64e4: Status 404 returned error can't find the container with id 68b1affc067a8160a4de26baac09a6bc0782eec9060a2a6bcba2732a213a64e4 Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.020996 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:51:08.444458554 +0000 UTC Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.070001 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.070097 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:42 crc kubenswrapper[4762]: E0217 14:05:42.070116 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:42 crc kubenswrapper[4762]: E0217 14:05:42.070265 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.073904 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.074597 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.075357 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.076033 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.076567 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.077064 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.078666 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.079201 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.080256 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.080807 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" 
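
The recurring webhook failure above — "x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:41Z is after 2025-08-24T17:21:41Z" — is a plain validity-window violation: the node's clock (February 2026) is past the NotAfter date (August 2025) of the serving certificate presented by the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, so every status patch routed through that webhook is rejected during the TLS handshake, which in turn explains the wall of "Failed to update status for pod" records. Below is a minimal standalone Go sketch of the same validity check, handy for confirming the window on a certificate pulled off the node; the PEM path is hypothetical and not taken from this log.

    // cert_window_check.go — illustrative sketch only, not kubelet code.
    // Parses a PEM certificate and tests whether the current time falls
    // inside its [NotBefore, NotAfter] validity window, the condition
    // whose failure produces the x509 errors seen in this log.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Hypothetical location; substitute the actual serving cert path.
        data, err := os.ReadFile("/tmp/webhook-cert.pem")
        if err != nil {
            fmt.Fprintln(os.Stderr, "read:", err)
            os.Exit(1)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            fmt.Fprintln(os.Stderr, "no PEM block found")
            os.Exit(1)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Fprintln(os.Stderr, "parse:", err)
            os.Exit(1)
        }
        now := time.Now()
        fmt.Printf("NotBefore: %s\nNotAfter:  %s\nNow:       %s\n",
            cert.NotBefore.Format(time.RFC3339),
            cert.NotAfter.Format(time.RFC3339),
            now.Format(time.RFC3339))
        // The validity-window condition the verifier enforces.
        if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
            fmt.Println("INVALID: current time is outside the validity window")
            os.Exit(2)
        }
        fmt.Println("OK: certificate is within its validity window")
    }

The equivalent quick check from a shell is "openssl x509 -noout -dates -in <cert.pem>", which prints the same notBefore/notAfter pair that the error message is comparing against the node clock.
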
Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.081811 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.083205 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.084221 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.084748 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.085604 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.086303 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.086887 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.087287 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.088218 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.088786 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.089571 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.090336 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.090778 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.091813 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 
14:05:42.092287 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.093343 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.094107 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.095169 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.095751 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.096620 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.097077 4762 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.097174 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.099186 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.099890 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.100344 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.101806 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.102768 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.103303 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 
14:05:42.105059 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.105700 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.106505 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.107085 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.108007 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.108927 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.109358 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.109886 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.110707 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.111403 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.112413 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.112891 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.113707 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.114237 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.114796 4762 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.115589 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.176124 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" exitCode=0 Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.176201 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.176414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"68b1affc067a8160a4de26baac09a6bc0782eec9060a2a6bcba2732a213a64e4"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.177589 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.177615 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.177627 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"b02fe83ac19bf716d3f11ee55cc0d77a82fb6b3b814320f25d064017ee2fe12d"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.179586 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerStarted","Data":"1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.179616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerStarted","Data":"403efe5bf756c2698e1e5a3d18e9605dda756172b326baa450f834e0e15cc195"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.182030 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e" exitCode=0 Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.182126 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 
14:05:42.182180 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerStarted","Data":"1c2611466480ee3da8ec398763ad027a8abd3b719f262c71a22296fb1f948b46"} Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.182699 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.195284 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.214188 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z 
is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.228558 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.243669 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.270600 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.283944 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.297338 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.312969 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.322482 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.336749 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.355285 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.369845 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.386675 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.403872 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.414678 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.428582 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc 
kubenswrapper[4762]: I0217 14:05:42.464954 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.519170 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcon
t/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.538030 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.586564 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.615935 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.660256 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.697871 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.734336 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.780127 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:42 crc kubenswrapper[4762]: I0217 14:05:42.817084 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.000493 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-76htw"] Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.000921 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.003047 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.003203 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.003231 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.003290 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.013841 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.021276 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:28:55.558514078 +0000 UTC Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.024465 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.041389 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.056189 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.058552 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5l6\" (UniqueName: \"kubernetes.io/projected/1a3db634-a0f8-46b2-b54f-a12a054aa004-kube-api-access-pw5l6\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.058596 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a3db634-a0f8-46b2-b54f-a12a054aa004-host\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.058694 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a3db634-a0f8-46b2-b54f-a12a054aa004-serviceca\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.069939 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.070095 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.100121 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\
\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.142111 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.160003 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a3db634-a0f8-46b2-b54f-a12a054aa004-host\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.160293 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a3db634-a0f8-46b2-b54f-a12a054aa004-serviceca\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.160331 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-pw5l6\" (UniqueName: \"kubernetes.io/projected/1a3db634-a0f8-46b2-b54f-a12a054aa004-kube-api-access-pw5l6\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.160136 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a3db634-a0f8-46b2-b54f-a12a054aa004-host\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.161462 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a3db634-a0f8-46b2-b54f-a12a054aa004-serviceca\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.176081 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.186866 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190186 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190210 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190229 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190238 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.190269 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.192028 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee" exitCode=0 Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.192172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee"} Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.206911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5l6\" (UniqueName: \"kubernetes.io/projected/1a3db634-a0f8-46b2-b54f-a12a054aa004-kube-api-access-pw5l6\") pod \"node-ca-76htw\" (UID: \"1a3db634-a0f8-46b2-b54f-a12a054aa004\") " pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.238526 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.275365 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.316571 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.357483 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.396321 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.434759 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.479600 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.516128 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.534008 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-76htw" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.555234 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: W0217 14:05:43.556615 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3db634_a0f8_46b2_b54f_a12a054aa004.slice/crio-8df9f4d22619cd75d519ce2c7bfed23385a4f6fd4c6b5c84ebc0e920b200da81 WatchSource:0}: Error finding container 8df9f4d22619cd75d519ce2c7bfed23385a4f6fd4c6b5c84ebc0e920b200da81: Status 404 returned error can't find the container with id 8df9f4d22619cd75d519ce2c7bfed23385a4f6fd4c6b5c84ebc0e920b200da81 Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.597990 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.640355 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.677171 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.715880 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.756253 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.765969 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.766112 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:05:47.766083859 +0000 UTC m=+28.346084561 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.766170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.766326 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.766396 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:47.766377376 +0000 UTC m=+28.346378028 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.799371 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5
t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.836294 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.867444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.867486 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.867513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867633 4762 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867674 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867689 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867730 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:47.867717523 +0000 UTC m=+28.447718175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867632 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867766 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:47.867760524 +0000 UTC m=+28.447761176 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867825 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867874 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867894 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:43 crc kubenswrapper[4762]: E0217 14:05:43.867980 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:47.867955049 +0000 UTC m=+28.447955741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.877363 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.916921 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:43 crc kubenswrapper[4762]: I0217 14:05:43.966413 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.003096 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.021965 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:55:13.768558213 +0000 UTC Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.038947 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.070387 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.070393 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:44 crc kubenswrapper[4762]: E0217 14:05:44.070527 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:44 crc kubenswrapper[4762]: E0217 14:05:44.070603 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.196602 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316" exitCode=0 Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.196694 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316"} Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.197588 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-76htw" event={"ID":"1a3db634-a0f8-46b2-b54f-a12a054aa004","Type":"ContainerStarted","Data":"5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565"} Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.197625 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-76htw" event={"ID":"1a3db634-a0f8-46b2-b54f-a12a054aa004","Type":"ContainerStarted","Data":"8df9f4d22619cd75d519ce2c7bfed23385a4f6fd4c6b5c84ebc0e920b200da81"} Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.212396 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.224041 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.244360 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.255531 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.268317 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc
32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.287357 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.315870 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.356599 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.396159 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.436627 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.474511 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.517582 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.559269 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.599447 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.636495 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.677660 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.725802 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.759137 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.794466 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.836196 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.876774 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.920837 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:44 crc kubenswrapper[4762]: I0217 14:05:44.957365 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.006514 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z 
is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.022163 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 14:56:46.404050479 +0000 UTC Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.041535 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.070777 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.071032 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.077070 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.118407 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 
2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.157746 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.204821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.207557 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564" exitCode=0 Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.207612 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" 
event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564"} Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.223186 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355
e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.238067 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.276121 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.315279 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.357259 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.399914 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.437940 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.480079 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.521965 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.557914 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.595844 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.645832 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8e
e7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"et
cd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.677139 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.720971 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.797606 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.799458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.799496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.799507 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.799605 4762 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.805487 4762 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.805774 4762 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.806603 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.806633 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.806660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.806675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.806686 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.822728 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z"
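
Every one of these patch failures, for pod status and node status alike, shares the single root cause stated verbatim in the errors: the serving certificate of the network-node-identity webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-02-17, so the kubelet's retries that follow are rejected identically. The diagnosis can be confirmed with a short Go sketch that dials the endpoint named in the log and prints the certificate's validity window; the address comes from the error text, and everything else is illustrative:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"time"
    )

    func main() {
    	// Endpoint taken from the "failed calling webhook" errors above.
    	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
    		// Skip verification: the goal is to inspect the expired
    		// certificate, which strict verification would reject outright.
    		InsecureSkipVerify: true,
    	})
    	if err != nil {
    		fmt.Println("dial failed:", err)
    		return
    	}
    	defer conn.Close()

    	// Leaf certificate presented by the webhook server; assumes the
    	// handshake surfaced at least one peer certificate.
    	cert := conn.ConnectionState().PeerCertificates[0]
    	fmt.Println("subject:  ", cert.Subject)
    	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
    	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
    	if time.Now().After(cert.NotAfter) {
    		fmt.Println("expired: matches the x509 error in the log")
    	}
    }

On a CRC-style single-node cluster resumed long after its certificate lifetimes, errors like these typically persist only until the cluster's own certificate rotation catches up after startup; until that serving certificate is renewed, every status patch that must pass through this webhook will keep failing, as the repeated retries below show.
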
event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.826049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.826061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.826069 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.837799 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.840756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.840775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.840784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.840796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.840804 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.853403 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.856600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.856738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.856816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.856889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.856964 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.869111 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.873023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.873149 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.873230 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.873332 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.873419 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.886893 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:45 crc kubenswrapper[4762]: E0217 14:05:45.887192 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.888364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.888459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.888535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.888622 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.888701 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.990225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.990458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.990524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.990585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:45 crc kubenswrapper[4762]: I0217 14:05:45.990653 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:45Z","lastTransitionTime":"2026-02-17T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.022739 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 20:04:45.397276188 +0000 UTC Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.070105 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.070149 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:46 crc kubenswrapper[4762]: E0217 14:05:46.070261 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:46 crc kubenswrapper[4762]: E0217 14:05:46.070369 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.092700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.092730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.092738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.092752 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.092762 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.195597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.195664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.195675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.195693 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.195705 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.213116 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed" exitCode=0 Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.213170 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.226204 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.239347 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.253571 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.266872 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.283316 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.296086 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.298096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.298134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.298145 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.298160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.298169 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.307770 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.325746 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.339536 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.351829 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.362596 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.380825 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.393779 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.399969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.399996 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.400005 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.400019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.400030 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.406406 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.502634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.502688 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.502700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.502716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.502726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.604712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.604760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.604771 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.604786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.604796 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.707042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.707269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.707349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.707434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.707545 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.810186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.810260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.810284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.810315 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.810341 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.828681 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.834070 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.839286 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.850084 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7
442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.863997 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.878990 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.895197 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.906699 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.912557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.912868 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.912964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.913065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.913151 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:46Z","lastTransitionTime":"2026-02-17T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.922085 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.935329 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.949065 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.960518 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.969820 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.985399 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:46 crc kubenswrapper[4762]: I0217 14:05:46.996799 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.007772 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.014894 4762 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.014936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.014949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.014967 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.014976 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.018291 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.022893 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 16:23:02.389563044 +0000 UTC Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.030409 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.040970 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.052052 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.070472 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.070817 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.073581 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cd
a3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.096624 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.117206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.117452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.117556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.117665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.117741 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.139050 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql
z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.178067 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.216777 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.219110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.219168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.219422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.219449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.219459 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.220057 4762 generic.go:334] "Generic (PLEG): container finished" podID="18a966ae-76bd-4298-9964-8be5f5b1dc95" containerID="4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530" exitCode=0 Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.220123 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerDied","Data":"4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.224088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.224418 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.249735 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.252451 4762 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.276193 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.314708 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.320987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.321021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.321029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.321042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.321050 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.389439 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.408425 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.423653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.423689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.423698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.423714 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.423725 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.436603 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.475742 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.520845 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.526011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.526033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.526043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.526056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.526066 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.556796 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.595062 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.628693 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.628757 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.628775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.628801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.628819 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.640328 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.693585 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.716249 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.730414 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.730442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.730450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.730463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.730472 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.759577 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.799179 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.805473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.805592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:47 crc 
kubenswrapper[4762]: E0217 14:05:47.805609 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.805585862 +0000 UTC m=+36.385586514 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.805695 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.805767 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.805756357 +0000 UTC m=+36.385757009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.832333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.832382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.832393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.832406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.832414 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.836805 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.878149 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.906586 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.906657 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.906695 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906752 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906778 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906789 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906823 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906838 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.906822786 +0000 UTC m=+36.486823438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906870 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.906857897 +0000 UTC m=+36.486858539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906880 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906934 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.906949 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:47 crc kubenswrapper[4762]: E0217 14:05:47.907025 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.906999021 +0000 UTC m=+36.486999723 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.914849 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.934281 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.934335 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.934346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.934363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.934375 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:47Z","lastTransitionTime":"2026-02-17T14:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.957103 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:47 crc kubenswrapper[4762]: I0217 14:05:47.996872 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.023619 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:21:59.521753245 +0000 UTC Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.037955 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.070357 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.070364 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:48 crc kubenswrapper[4762]: E0217 14:05:48.070541 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:48 crc kubenswrapper[4762]: E0217 14:05:48.070625 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.076671 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.121624 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d
01f128bb62392f6715582a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.139084 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.139119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.139129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.139146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.139158 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.233540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" event={"ID":"18a966ae-76bd-4298-9964-8be5f5b1dc95","Type":"ContainerStarted","Data":"3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.233613 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.234360 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.241241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.241273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.241283 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.241298 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.241311 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.250917 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.259690 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:05:48 crc 
kubenswrapper[4762]: I0217 14:05:48.261698 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.274745 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.297603 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.317719 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.343355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.343391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.343401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.343416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.343428 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.357069 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.398747 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.437830 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.445740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.445775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.445782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.445795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.445804 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.475382 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.513596 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.547994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.548034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.548043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.548058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.548068 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.557754 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.598019 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.639341 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.650297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.650325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.650333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.650346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.650354 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.674986 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.721786 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name
\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\
"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.752660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.752739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.752754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.752771 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.752781 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.756537 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",
\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.795769 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.836690 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.854340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.854389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.854400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.854444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.854459 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.875346 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.914003 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.956509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.956592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.956606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.956632 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.956661 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:48Z","lastTransitionTime":"2026-02-17T14:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.957541 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:48 crc kubenswrapper[4762]: I0217 14:05:48.996999 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.009375 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.023937 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 10:26:15.663926528 +0000 UTC Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.055894 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.061838 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.061882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.061892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.061907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.061916 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.069778 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:49 crc kubenswrapper[4762]: E0217 14:05:49.069894 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.110718 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d
01f128bb62392f6715582a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.137667 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.163576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.163617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.163628 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.163665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.163680 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.176182 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.216602 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21ba
a2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.236737 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.264281 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.265307 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.265341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.265352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.265369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.265381 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.295363 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.338309 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.368398 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.368452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc 
kubenswrapper[4762]: I0217 14:05:49.368461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.368477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.368488 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.470136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.470191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.470202 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.470220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.470233 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.572998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.573080 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.573091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.573109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.573118 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.675382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.675416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.675447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.675463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.675477 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.777314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.777352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.777363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.777378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.777388 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.881435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.881853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.881866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.881885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.881899 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.984455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.984493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.984506 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.984525 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:49 crc kubenswrapper[4762]: I0217 14:05:49.984537 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:49Z","lastTransitionTime":"2026-02-17T14:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.024120 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 23:52:30.613422276 +0000 UTC Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.070815 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.070835 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:50 crc kubenswrapper[4762]: E0217 14:05:50.071547 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:50 crc kubenswrapper[4762]: E0217 14:05:50.071758 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.085009 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.086979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.087044 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.087053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.087093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.087106 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.096667 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.108582 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.119038 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.128982 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.140628 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.157243 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.168515 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.184694 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.190824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.190855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.190865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc 
kubenswrapper[4762]: I0217 14:05:50.190881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.190892 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.196099 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.206573 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.217537 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.233710 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.243493 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/0.log" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.244338 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.246539 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e" exitCode=1 Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.246576 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.247318 4762 scope.go:117] "RemoveContainer" containerID="8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.257997 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.269655 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.282747 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.294181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.294214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.294222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.294236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.294247 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.295592 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.305733 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.315671 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.325388 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.339263 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.349915 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.365917 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:50Z\\\",\\\"message\\\":\\\"49 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:05:50.021511 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:05:50.021523 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:05:50.021626 6073 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:05:50.021666 6073 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:05:50.021669 6073 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:05:50.021692 6073 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:05:50.021702 6073 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:05:50.022349 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:05:50.022365 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:05:50.022396 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:05:50.022418 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:05:50.022400 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:05:50.022437 6073 factory.go:656] Stopping watch factory\\\\nI0217 14:05:50.022448 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.382695 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.393720 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.396144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.396168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.396176 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.396191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.396199 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.417071 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295
c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.460814 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.496620 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.498437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.498474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.498488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.498505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.498516 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.546838 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:
05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.605324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.605364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.605375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.605392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.605404 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.707827 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.707879 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.707897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.707920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.707936 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.810526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.810564 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.810573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.810587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.810596 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.843840 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.912759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.912788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.912797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.912810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:50 crc kubenswrapper[4762]: I0217 14:05:50.912820 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:50Z","lastTransitionTime":"2026-02-17T14:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.015310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.015352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.015363 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.015378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.015388 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.024471 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:50:55.44614021 +0000 UTC Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.070087 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:51 crc kubenswrapper[4762]: E0217 14:05:51.070232 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.117562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.117600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.117610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.117627 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.117652 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.220112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.220153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.220164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.220178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.220188 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.251112 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/0.log" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.253893 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.253973 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.264786 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.275691 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.286563 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.298415 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.310726 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.322486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.322514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.322522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.322535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.322544 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.324711 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.341046 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:50Z\\\",\\\"message\\\":\\\"49 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:05:50.021511 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:05:50.021523 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:05:50.021626 6073 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:05:50.021666 6073 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:05:50.021669 6073 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:05:50.021692 6073 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:05:50.021702 6073 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:05:50.022349 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:05:50.022365 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:05:50.022396 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:05:50.022418 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:05:50.022400 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:05:50.022437 6073 factory.go:656] Stopping watch factory\\\\nI0217 14:05:50.022448 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.353769 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.362975 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.
11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.372691 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.381881 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.392051 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.407826 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.418095 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.424378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.424400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.424408 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.424420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.424428 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.436869 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.527405 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.527449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.527462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.527478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.527487 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.571240 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.630079 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.630126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.630137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.630151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.630162 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.732755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.732794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.732804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.732820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.732833 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.836921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.836991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.837017 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.837048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.837071 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.940086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.940665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.940740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.940807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:51 crc kubenswrapper[4762]: I0217 14:05:51.940873 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:51Z","lastTransitionTime":"2026-02-17T14:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.025347 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:42:07.763889547 +0000 UTC Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.043487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.043577 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.043594 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.043617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.043635 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.070031 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.070070 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:52 crc kubenswrapper[4762]: E0217 14:05:52.070282 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:52 crc kubenswrapper[4762]: E0217 14:05:52.070440 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.146478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.146547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.146560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.146576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.146587 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.249572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.249639 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.249684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.249701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.249712 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.259846 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/1.log" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.260670 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/0.log" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.263037 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.263326 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc" exitCode=1 Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.263369 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.263406 4762 scope.go:117] "RemoveContainer" containerID="8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.264388 4762 scope.go:117] "RemoveContainer" containerID="486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc" Feb 17 14:05:52 crc kubenswrapper[4762]: E0217 14:05:52.264623 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.280016 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.300766 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:50Z\\\",\\\"message\\\":\\\"49 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:05:50.021511 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:05:50.021523 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:05:50.021626 6073 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:05:50.021666 6073 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:05:50.021669 6073 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:05:50.021692 6073 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:05:50.021702 6073 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:05:50.022349 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:05:50.022365 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:05:50.022396 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:05:50.022418 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:05:50.022400 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:05:50.022437 6073 factory.go:656] Stopping watch factory\\\\nI0217 14:05:50.022448 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0217 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} 
name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.316455 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.330636 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.342054 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.351767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.351815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.351826 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.351843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.351854 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.368766 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.381257 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.397445 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.411747 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.424523 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.435571 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.450246 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.453763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.453789 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.453797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.453809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.453818 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.464854 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.479631 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.492068 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.557104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.557134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.557141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.557154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.557162 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.659351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.659398 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.659415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.659438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.659454 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.762397 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.762462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.762477 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.762493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.762503 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.865520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.865818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.865894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.865964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.866029 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.969730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.970175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.970312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.970418 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:52 crc kubenswrapper[4762]: I0217 14:05:52.970543 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:52Z","lastTransitionTime":"2026-02-17T14:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.025547 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 02:27:57.752224408 +0000 UTC Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.070015 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:53 crc kubenswrapper[4762]: E0217 14:05:53.070235 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.073209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.073421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.073506 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.073598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.073622 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.177187 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.177238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.177255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.177277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.177293 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.227315 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d"] Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.228000 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.231496 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.233000 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.255972 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.268080 4762 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/1.log" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.271786 4762 scope.go:117] "RemoveContainer" containerID="486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc" Feb 17 14:05:53 crc kubenswrapper[4762]: E0217 14:05:53.272147 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.279907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.279959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.279969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.279984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.280017 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.298074 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0f407725e5b426b5c700b857961c3df3a2925d01f128bb62392f6715582a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:50Z\\\",\\\"message\\\":\\\"49 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 14:05:50.021511 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:05:50.021523 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:05:50.021626 6073 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:05:50.021666 6073 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:05:50.021669 6073 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:05:50.021692 6073 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:05:50.021702 6073 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:05:50.022349 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 14:05:50.022365 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:05:50.022396 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 14:05:50.022418 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:05:50.022400 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:05:50.022437 6073 factory.go:656] Stopping watch factory\\\\nI0217 14:05:50.022448 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.320490 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.335251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.348160 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 
2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.359401 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.362864 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.362991 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: 
\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.363064 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.363197 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdfh\" (UniqueName: \"kubernetes.io/projected/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-kube-api-access-qvdfh\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.381319 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.382733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.382761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.382772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.382784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.382794 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.394527 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.414423 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.430423 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.441111 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.451443 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.463981 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.464117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.464416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.464498 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdfh\" (UniqueName: \"kubernetes.io/projected/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-kube-api-access-qvdfh\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.464547 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.464933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.465195 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.472230 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.478278 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.480983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdfh\" (UniqueName: \"kubernetes.io/projected/22fa85ee-f73c-44a4-97e9-660bdf0a07f6-kube-api-access-qvdfh\") pod \"ovnkube-control-plane-749d76644c-dw82d\" (UID: \"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.485292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.485443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.485524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.485623 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.485732 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.493313 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.506078 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.520219 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.530544 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.547892 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.551625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.559394 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: W0217 14:05:53.563017 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fa85ee_f73c_44a4_97e9_660bdf0a07f6.slice/crio-2f379f1805219e08a4b7d1f9c10d757899520406c95bb2f6d4d75ca4046ea120 WatchSource:0}: Error finding container 2f379f1805219e08a4b7d1f9c10d757899520406c95bb2f6d4d75ca4046ea120: Status 404 returned error can't find the container with id 
2f379f1805219e08a4b7d1f9c10d757899520406c95bb2f6d4d75ca4046ea120 Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.571773 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.583013 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.589815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:53 crc 
kubenswrapper[4762]: I0217 14:05:53.589853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.589864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.589878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.589886 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.599169 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.617524 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.629633 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.644014 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.656971 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.667787 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.678190 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.687996 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.692765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.692807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.692821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.692836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.692847 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.698595 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.710492 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:53Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.797802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.797837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.797846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.797885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.797894 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.899775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.899815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.899824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.899838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:53 crc kubenswrapper[4762]: I0217 14:05:53.899847 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:53Z","lastTransitionTime":"2026-02-17T14:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.002593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.002626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.002634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.002671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.002681 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.025951 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 07:45:46.242183299 +0000 UTC
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.070948 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.071001 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:05:54 crc kubenswrapper[4762]: E0217 14:05:54.071134 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:05:54 crc kubenswrapper[4762]: E0217 14:05:54.071218 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.105401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.105447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.105460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.105478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.105494 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.207938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.207986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.207998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.208016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.208029 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.276279 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" event={"ID":"22fa85ee-f73c-44a4-97e9-660bdf0a07f6","Type":"ContainerStarted","Data":"553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.276319 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" event={"ID":"22fa85ee-f73c-44a4-97e9-660bdf0a07f6","Type":"ContainerStarted","Data":"d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.276329 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" event={"ID":"22fa85ee-f73c-44a4-97e9-660bdf0a07f6","Type":"ContainerStarted","Data":"2f379f1805219e08a4b7d1f9c10d757899520406c95bb2f6d4d75ca4046ea120"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.291878 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.304146 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.310376 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.310415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.310424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.310438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.310449 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.327559 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.336358 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7v8bf"]
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.336857 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:05:54 crc kubenswrapper[4762]: E0217 14:05:54.336936 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.342594 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.353749 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.365765 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z"
Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.379328 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.401251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"
cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.412687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.412736 4762 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.412748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.412764 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.412776 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.415183 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.428885 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.443823 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.455267 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.475282 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.475333 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2nw\" (UniqueName: \"kubernetes.io/projected/63580a98-4d0e-434e-ad09-e7d542e7a5cc-kube-api-access-lr2nw\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.476029 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.486174 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.497352 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.510057 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.514728 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.514769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.514781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.514797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.514808 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.523905 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.537247 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516a
f60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.556221 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb192052
8659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.567582 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.576356 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.576402 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2nw\" (UniqueName: \"kubernetes.io/projected/63580a98-4d0e-434e-ad09-e7d542e7a5cc-kube-api-access-lr2nw\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:54 crc kubenswrapper[4762]: E0217 14:05:54.576534 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:54 crc kubenswrapper[4762]: E0217 14:05:54.576630 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:05:55.076607633 +0000 UTC m=+35.656608285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.579148 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.592119 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.592579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2nw\" (UniqueName: \"kubernetes.io/projected/63580a98-4d0e-434e-ad09-e7d542e7a5cc-kube-api-access-lr2nw\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.602251 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" 
for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.615497 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-
operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.616795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.616825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.616833 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.616855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.616865 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.632904 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.646876 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.657094 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.669777 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.681274 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.697989 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.715294 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.719822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.719866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.719881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.719898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.719910 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.726218 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.740029 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 
14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.822629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.823133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.823244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.823369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.823466 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.926177 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.926218 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.926265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.926284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:54 crc kubenswrapper[4762]: I0217 14:05:54.926296 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:54Z","lastTransitionTime":"2026-02-17T14:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.026213 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:57:55.060877132 +0000 UTC Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.028710 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.028754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.028763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.028780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.028789 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.069756 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.069868 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.081423 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.081598 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.081697 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:05:56.08167829 +0000 UTC m=+36.661678942 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.130596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.130637 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.130661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.130676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.130689 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.233091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.233160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.233182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.233209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.233226 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.278302 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.291240 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.309704 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.319355 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.332472 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.334785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.334817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.334828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.334842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.334853 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.345056 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":
\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.355144 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.369157 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.380499 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.391294 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.401372 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.412806 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.424925 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.436572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.436608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.436617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.436630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.436660 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.443631 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.459932 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.471712 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.481142 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.490869 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 
14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.539152 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.539201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.539212 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.539228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.539240 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.641660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.641708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.641720 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.641737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.641748 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.744078 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.744137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.744150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.744167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.744179 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.846324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.846367 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.846383 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.846406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.846418 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.888263 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.888738 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.888862 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.891718 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:11.891688343 +0000 UTC m=+52.471688995 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.891753 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:11.891743385 +0000 UTC m=+52.471744037 (durationBeforeRetry 16s). 
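
The recurring `object "namespace"/"name" not registered` errors mean the kubelet's local object cache has not yet synced those ConfigMaps and Secrets; on their own they do not prove the objects are missing from the API server. A hedged sketch using client-go to check the API side directly — the kubeconfig path is an assumption for illustration, not taken from this log:

// cmcheck.go — distinguish "absent from the API" from "not yet in kubelet's cache".
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed path; substitute whatever admin kubeconfig is available on the node.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	cm, err := cs.CoreV1().ConfigMaps("openshift-network-console").
		Get(context.TODO(), "networking-console-plugin", metav1.GetOptions{})
	if err != nil {
		log.Fatalf("configmap lookup: %v", err) // truly absent, vs. merely unsynced above
	}
	fmt.Printf("found %s/%s (resourceVersion %s)\n", cm.Namespace, cm.Name, cm.ResourceVersion)
}
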
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.948471 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.948505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.948515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.948531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.948542 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:55Z","lastTransitionTime":"2026-02-17T14:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.993139 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.993211 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:55 crc kubenswrapper[4762]: I0217 14:05:55.993259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993344 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993346 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993377 4762 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993392 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993401 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:11.993384549 +0000 UTC m=+52.573385201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993426 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:11.99341503 +0000 UTC m=+52.573415682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993461 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993493 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993507 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:55 crc kubenswrapper[4762]: E0217 14:05:55.993567 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:11.993552764 +0000 UTC m=+52.573553416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.026936 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 17:43:33.162842764 +0000 UTC Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.051385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.051422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.051430 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.051445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.051453 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.070699 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.070835 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.070727 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.070712 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.070917 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
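
Every "network is not ready" pod sync error above repeats the same condition: nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/ yet. A minimal sketch that inspects that directory on the node (the path is copied verbatim from the kubelet message):

// cnicheck.go — confirm whether the network provider has dropped a CNI config yet.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the kubelet error
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files — NetworkReady will stay false")
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(dir, e.Name()))
	}
}
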
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.071131 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.094382 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.094550 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.094611 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:05:58.094595822 +0000 UTC m=+38.674596474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.153543 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.153581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.153590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.153620 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.153631 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.181262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.181293 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.181303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.181317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.181326 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.196193 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.200341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.200387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.200402 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.200424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.200439 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.215467 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.219031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.219065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.219077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.219095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.219112 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.230486 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.234191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.234224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.234232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.234246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.234255 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.244636 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.247982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.248050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.248065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.248081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.248092 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.260156 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 14:05:56 crc kubenswrapper[4762]: E0217 14:05:56.260270 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.261629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
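
The three failed patch attempts above embed the kubelet's entire node-status patch as a twice-quoted JSON string: the kubelet quotes the payload once when formatting the error, and the structured-log rendering escapes it again. A minimal Python sketch for unescaping and inspecting such a payload, assuming the journal line is pasted exactly as it appears here; the helper name is hypothetical:

```python
import json
import re

def decode_status_patch(journal_line):
    # Pull the quoted patch out of: err="failed to patch status \"{...}\" for node ..."
    m = re.search(r'failed to patch status \\"(.*)\\" for node', journal_line)
    if m is None:
        return None
    payload = m.group(1)
    # Two quoting layers means two rounds of unescaping before it parses as JSON.
    for _ in range(2):
        payload = payload.encode().decode("unicode_escape")
    return json.loads(payload)

# e.g. decode_status_patch(line)["status"]["conditions"] lists the four
# conditions being patched (MemoryPressure, DiskPressure, PIDPressure, Ready).
```
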
event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.261719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.261730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.261744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.261754 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.363776 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.363821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.363834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.363848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.363861 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.466854 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.466885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.466894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.466907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.466917 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.569380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.569429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.569440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.569456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.569466 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.671725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.671773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.671781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.671794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.671804 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.774572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.774606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.774615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.774630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.774652 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.878048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.878131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.878164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.878197 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.878220 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.981073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.981129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.981141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.981159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:56 crc kubenswrapper[4762]: I0217 14:05:56.981171 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:56Z","lastTransitionTime":"2026-02-17T14:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.028000 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 12:49:30.415816044 +0000 UTC Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.070730 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:57 crc kubenswrapper[4762]: E0217 14:05:57.070916 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.084874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.085235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.085249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.085267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.085280 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.187167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.187199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.187206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.187219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.187227 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.289420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.289455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.289465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.289480 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.289492 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.394718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.394775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.394789 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.394808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.394819 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.498514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.498569 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.498586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.498605 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.498620 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.601193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.601251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.601268 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.601291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.601307 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.704097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.704349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.704457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.704540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.704693 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.807246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.807291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.807306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.807328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.807346 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.910096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.910142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.910155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.910173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:57 crc kubenswrapper[4762]: I0217 14:05:57.910186 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:57Z","lastTransitionTime":"2026-02-17T14:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.013454 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.013495 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.013505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.013524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.013538 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.028749 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 10:26:01.83047303 +0000 UTC Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.070466 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:05:58 crc kubenswrapper[4762]: E0217 14:05:58.070601 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.070487 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:05:58 crc kubenswrapper[4762]: E0217 14:05:58.070693 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.070954 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:58 crc kubenswrapper[4762]: E0217 14:05:58.071143 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.113684 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:05:58 crc kubenswrapper[4762]: E0217 14:05:58.113832 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:58 crc kubenswrapper[4762]: E0217 14:05:58.113881 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:06:02.113866205 +0000 UTC m=+42.693866857 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.115270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.115303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.115314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.115331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.115342 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.217987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.218017 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.218026 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.218039 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.218049 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
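
The nestedpendingoperations entry above shows the volume manager's exponential backoff at work: the metrics-certs mount is deferred for 4s (durationBeforeRetry 4s) because the openshift-multus/metrics-daemon-secret object is not yet registered with the kubelet. A hypothetical reconstruction of that schedule, using the upstream kubelet defaults (500ms initial delay, doubling per failure, capped at 2m2s) as assumed constants:

```python
def backoff_schedule(failures, initial=0.5, factor=2.0, cap=122.0):
    # Returns the delay (seconds) applied after each consecutive failure.
    delay, schedule = initial, []
    for _ in range(failures):
        schedule.append(min(delay, cap))
        delay *= factor
    return schedule

print(backoff_schedule(6))  # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]; the 4s above fits the 4th failure
```
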
Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.320447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.320790 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.320861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.320881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.320894 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.426474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.426520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.426532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.426551 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.426565 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.530016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.530059 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.530067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.530082 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.530091 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.631935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.632003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.632027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.632053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.632067 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.734810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.734852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.734863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.734881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.734892 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.837002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.837067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.837081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.837099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.837115 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.940247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.940314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.940328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.940349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:58 crc kubenswrapper[4762]: I0217 14:05:58.940360 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:58Z","lastTransitionTime":"2026-02-17T14:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.029517 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:54:10.270338144 +0000 UTC Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.042829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.043225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.043682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.044002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.044208 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.070319 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:05:59 crc kubenswrapper[4762]: E0217 14:05:59.070701 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.146281 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.146320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.146328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.146341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.146351 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.248167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.248217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.248228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.248246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.248257 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.351096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.351139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.351150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.351165 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.351176 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.453998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.454029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.454038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.454052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.454060 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.556209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.556247 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.556257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.556271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.556280 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.659098 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.659126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.659133 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.659148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.659180 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.761937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.762034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.762047 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.762065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.762077 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.864947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.865213 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.865433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.865593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.865726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.968236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.968300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.968321 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.968350 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:05:59 crc kubenswrapper[4762]: I0217 14:05:59.968368 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:05:59Z","lastTransitionTime":"2026-02-17T14:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.029808 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 09:53:02.213727439 +0000 UTC Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.070125 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.070163 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.070135 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:00 crc kubenswrapper[4762]: E0217 14:06:00.070328 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:00 crc kubenswrapper[4762]: E0217 14:06:00.070427 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:00 crc kubenswrapper[4762]: E0217 14:06:00.070523 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.073283 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.073366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.073391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.073612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.073631 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.085268 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.097714 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.108250 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.121627 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.132709 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.143500 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.154855 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.175813 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.176406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.176454 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.176465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.176481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.176493 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.186722 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.211210 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.223257 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.232607 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.244369 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 
14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.259935 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.272768 4762 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516a
f60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.277929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.278131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.278201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.278330 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.278428 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.289469 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.299668 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.381609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.381669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.381680 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.381724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.381746 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.484467 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.484513 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.484532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.484551 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.484562 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.586581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.586619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.586627 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.586676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.586689 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.688530 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.688593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.688636 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.688671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.688681 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.790974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.791030 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.791050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.791068 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.791079 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.894376 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.894428 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.894438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.894459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.894470 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.997194 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.997245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.997254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.997267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:00 crc kubenswrapper[4762]: I0217 14:06:00.997277 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:00Z","lastTransitionTime":"2026-02-17T14:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.030550 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:30:47.335335119 +0000 UTC Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.070217 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:01 crc kubenswrapper[4762]: E0217 14:06:01.070376 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.100053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.100118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.100128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.100142 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.100151 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.202718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.202757 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.202767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.202781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.202790 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.305277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.305341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.305366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.305468 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.305498 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.408254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.408292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.408300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.408313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.408323 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.511443 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.511498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.511511 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.511530 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.511545 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.613680 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.613716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.613725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.613743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.613752 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.716037 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.716066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.716076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.716089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.716099 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.818676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.818719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.818734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.818752 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.818764 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.920813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.920863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.920871 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.920884 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:01 crc kubenswrapper[4762]: I0217 14:06:01.920911 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:01Z","lastTransitionTime":"2026-02-17T14:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.023593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.023673 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.023686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.023706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.023723 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.031020 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 03:14:08.329003834 +0000 UTC Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.070859 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.070896 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.070930 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:02 crc kubenswrapper[4762]: E0217 14:06:02.071082 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:02 crc kubenswrapper[4762]: E0217 14:06:02.071157 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:02 crc kubenswrapper[4762]: E0217 14:06:02.071228 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.126258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.126331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.126350 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.126376 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.126396 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.162334 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:02 crc kubenswrapper[4762]: E0217 14:06:02.162569 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:02 crc kubenswrapper[4762]: E0217 14:06:02.162683 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. 
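The metrics-certs failure is not a missing file on disk but a cache miss: "not registered" means the kubelet's object cache has no watch registered for that secret yet, which is common while the node is still coming up; the operation is requeued with backoff (8 s here, next attempt at 14:06:10). To confirm the secret exists server-side (hypothetical session, assuming oc access):

    # If the secret exists in the API, the mount will succeed once the
    # kubelet's cache catches up
    oc -n openshift-multus get secret metrics-daemon-secret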
Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.434396 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.434438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.434447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.434461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.434470 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.537404 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.537444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.537455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.537472 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.537492 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.640347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.640378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.640387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.640417 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.640426 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.742748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.742802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.742817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.742843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.742859 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.844830 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.844870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.844878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.844894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.844903 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.947212 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.947252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.947262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.947278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:02 crc kubenswrapper[4762]: I0217 14:06:02.947289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:02Z","lastTransitionTime":"2026-02-17T14:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.032176 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:30:22.605195592 +0000 UTC Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.049966 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.050015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.050027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.050044 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.050057 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.070184 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:03 crc kubenswrapper[4762]: E0217 14:06:03.070306 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.152618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.152677 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.152702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.152717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.152726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.255764 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.255809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.255821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.255837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.255848 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.357473 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.357518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.357533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.357556 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.357569 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.460038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.460104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.460128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.460159 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.460180 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.563451 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.563509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.563533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.563561 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.563582 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.666347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.666386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.666395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.666410 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.666420 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.768858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.768929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.768939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.768955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.768966 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.871936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.871989 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.872002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.872021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.872033 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.974097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.974148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.974164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.974186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:03 crc kubenswrapper[4762]: I0217 14:06:03.974197 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:03Z","lastTransitionTime":"2026-02-17T14:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.033201 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 09:31:06.850363147 +0000 UTC Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.070688 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.070745 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.070873 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:04 crc kubenswrapper[4762]: E0217 14:06:04.070993 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:04 crc kubenswrapper[4762]: E0217 14:06:04.071165 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:04 crc kubenswrapper[4762]: E0217 14:06:04.071242 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.076225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.076249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.076257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.076270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.076278 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.178716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.178773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.178782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.178795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.178803 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.280914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.280958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.280969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.280987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.281002 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.297311 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.298119 4762 scope.go:117] "RemoveContainer" containerID="486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.383352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.383387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.383399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.383437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.383451 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.485095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.485188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.485198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.485211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.485220 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.587347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.587396 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.587408 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.587427 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.587440 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.689697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.689736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.689746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.689761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.689774 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.791973 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.792007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.792015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.792028 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.792037 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.894401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.894447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.894460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.894478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.894494 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.996517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.996559 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.996571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.996587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:04 crc kubenswrapper[4762]: I0217 14:06:04.996599 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:04Z","lastTransitionTime":"2026-02-17T14:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.034223 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:36:43.219027761 +0000 UTC Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.070510 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:05 crc kubenswrapper[4762]: E0217 14:06:05.070671 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.098976 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.099014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.099022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.099038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.099047 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.201481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.201520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.201536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.201552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.201563 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.303924 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.303964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.303973 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.303986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.303996 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.306403 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/2.log" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.306957 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/1.log" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.309098 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2" exitCode=1 Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.309125 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.309169 4762 scope.go:117] "RemoveContainer" containerID="486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.309764 4762 scope.go:117] "RemoveContainer" containerID="cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2" Feb 17 14:06:05 crc kubenswrapper[4762]: E0217 14:06:05.309926 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.322114 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.333538 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.353096 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.367036 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.378180 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.387190 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.400453 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T
14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.406047 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.406093 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.406104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.406120 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.406132 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.418072 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.436588 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.447329 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.460321 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.474328 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.485197 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.498449 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.508438 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.508482 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.508491 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.508508 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.508519 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.510473 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.521095 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.529239 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:05Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.611539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.611603 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.611625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.611682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.611707 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.714701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.714770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.714791 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.714821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.714843 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.817299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.817348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.817362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.817380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.817392 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.919941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.919979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.919988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.920002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:05 crc kubenswrapper[4762]: I0217 14:06:05.920012 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:05Z","lastTransitionTime":"2026-02-17T14:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.022277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.022318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.022326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.022341 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.022352 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.034865 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 10:18:15.293984968 +0000 UTC Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.070215 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.070215 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.070215 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.070999 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.071084 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.071146 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.124895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.124946 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.124958 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.124975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.124988 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.227378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.227427 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.227442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.227459 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.227470 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.314757 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/2.log" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.330232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.330302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.330326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.330355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.330375 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.358335 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.358409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.358425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.358449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.358466 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.370475 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.374575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.374611 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.374621 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.374661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.374674 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.390233 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.393948 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.393991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.394003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.394022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.394035 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.407040 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.410813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.410858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.410880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.410907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.410921 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.422993 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.427100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.427140 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.427151 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.427171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.427180 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.438226 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:06Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:06 crc kubenswrapper[4762]: E0217 14:06:06.438384 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.439967 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.439998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.440011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.440025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.440036 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.542689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.543146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.543369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.543553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.543820 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.646155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.646387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.646453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.646521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.646585 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.749395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.749432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.749441 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.749455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.749464 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.852361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.852737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.852838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.852935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.853022 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.955724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.955820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.955831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.955845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:06 crc kubenswrapper[4762]: I0217 14:06:06.955856 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:06Z","lastTransitionTime":"2026-02-17T14:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.036375 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:12:55.451024382 +0000 UTC Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.058371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.058414 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.058425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.058442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.058453 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.070686 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:07 crc kubenswrapper[4762]: E0217 14:06:07.070809 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.161432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.161480 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.161488 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.161503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.161512 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
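Every one of these conditions points at the same root cause: nothing has written a CNI network configuration into /etc/kubernetes/cni/net.d/ yet. Here is a minimal sketch of the corresponding check, assuming direct access to the node filesystem; libcni conventionally loads *.conf, *.conflist and *.json files from this directory, so an empty listing is consistent with the NetworkPluginNotReady error above.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the kubelet error message above.
	const cniDir = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(cniDir)
	if err != nil {
		fmt.Println("cannot read CNI dir:", err)
		return
	}

	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni scans for
			fmt.Println("CNI config:", filepath.Join(cniDir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// Matches what the kubelet is reporting: no CNI config yet.
		fmt.Println("no CNI configuration files found")
	}
}
```

On a healthy CRC node this directory is typically populated by the OVN-Kubernetes network operator shortly after start; until that happens the kubelet keeps re-reporting KubeletNotReady, exactly as the surrounding entries show.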
Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.263235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.263265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.263275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.263288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.263297 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.365697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.365730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.365739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.365753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.365763 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.468620 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.468681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.468694 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.468712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.468724 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.570990 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.571039 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.571049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.571066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.571077 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.673396 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.673446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.673460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.673478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.673490 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.775481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.775509 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.775517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.775530 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.775539 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.877536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.877576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.877587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.877606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.877676 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.980294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.980343 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.980359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.980382 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:07 crc kubenswrapper[4762]: I0217 14:06:07.980398 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:07Z","lastTransitionTime":"2026-02-17T14:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.037156 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 01:54:12.648328129 +0000 UTC Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.070875 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.070916 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:08 crc kubenswrapper[4762]: E0217 14:06:08.071021 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.071094 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:08 crc kubenswrapper[4762]: E0217 14:06:08.071202 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:08 crc kubenswrapper[4762]: E0217 14:06:08.071373 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.082294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.082340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.082351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.082371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.082382 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.184952 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.184997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.185007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.185030 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.185482 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.287562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.287610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.287621 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.287638 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.287677 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.390433 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.390479 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.390493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.390508 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.390521 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.492595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.492678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.492697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.492718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.492733 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.595052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.595099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.595110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.595135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.595147 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.697742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.697775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.697784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.697798 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.697808 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.800522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.800571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.800587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.800602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.800612 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.903838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.903915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.903940 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.903968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:08 crc kubenswrapper[4762]: I0217 14:06:08.903986 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:08Z","lastTransitionTime":"2026-02-17T14:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.007173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.007228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.007243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.007264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.007280 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.037635 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 01:55:26.323896077 +0000 UTC Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.070150 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:09 crc kubenswrapper[4762]: E0217 14:06:09.070262 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.109278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.109322 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.109338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.109356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.109365 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.211918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.211968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.211979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.211995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.212005 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
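Interleaved with the status spam, certificate_manager.go logs a fresh kubelet-serving rotation deadline each second (2025-12-22, 2025-12-08, 2025-12-02 above), every one of them already in the past relative to the node's clock, so rotation is due immediately and the deadline is recomputed on each pass. The deadlines differ because the manager jitters them inside the certificate's validity window; the sketch below assumes a uniform draw from the 70-90% band of the lifetime, which illustrates the idea rather than reproducing the upstream formula.

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline returns a random instant inside the 70%-90% band of the
// certificate's validity window. The exact band is an assumption made for
// this sketch; the real jitter lives in client-go's certificate manager.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	frac := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(frac * float64(total)))
}

func main() {
	// NotAfter copied from the log; NotBefore is hypothetical (the log does
	// not show it) and chosen only to make the example runnable.
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z")
	notBefore := notAfter.Add(-30 * 24 * time.Hour)

	for i := 0; i < 3; i++ {
		fmt.Println("candidate rotation deadline:", rotationDeadline(notBefore, notAfter))
	}
}
```

Each run yields a different deadline, consistent with the varying timestamps the certificate manager logs on successive passes.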
Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.314508 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.314566 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.314580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.314597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.314609 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.417296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.417334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.417344 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.417361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.417372 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.519693 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.519733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.519772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.519792 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.519804 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.623150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.623570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.623840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.624007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.624142 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.726393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.726697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.726813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.726902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.726979 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.830864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.830943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.830965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.830995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.831016 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.933539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.933574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.933584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.933598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:09 crc kubenswrapper[4762]: I0217 14:06:09.933609 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:09Z","lastTransitionTime":"2026-02-17T14:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.036062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.036323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.036411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.036491 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.036561 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.038271 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:13:34.618803002 +0000 UTC Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.070468 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.070511 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.070569 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:10 crc kubenswrapper[4762]: E0217 14:06:10.070634 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:10 crc kubenswrapper[4762]: E0217 14:06:10.070740 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:10 crc kubenswrapper[4762]: E0217 14:06:10.070818 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.083359 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.092864 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.104680 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.118474 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.132165 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.138981 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.139271 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.139355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.139436 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.139515 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.150845 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.167779 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.180848 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.194934 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.206385 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.217245 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.229887 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242082 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242110 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242735 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.242778 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:10 crc kubenswrapper[4762]: E0217 14:06:10.242870 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 14:06:10 crc kubenswrapper[4762]: E0217 14:06:10.242946 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:06:26.242924997 +0000 UTC m=+66.822925669 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.252171 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.263247 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.278808 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.298531 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:10Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.344770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.344811 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.344821 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.344839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.344853 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.446854 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.446885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.446894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.446907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.446916 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.550964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.551021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.551043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.551071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.551092 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.655095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.655164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.655180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.655201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.655213 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.759034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.759086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.759096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.759117 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.759130 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.861796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.861850 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.861865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.861886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.861902 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.964548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.964575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.964583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.964596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:10 crc kubenswrapper[4762]: I0217 14:06:10.964605 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:10Z","lastTransitionTime":"2026-02-17T14:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.039070 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:59:19.934492283 +0000 UTC Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.067108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.067158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.067168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.067183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.067193 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.070638 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:11 crc kubenswrapper[4762]: E0217 14:06:11.070777 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.169490 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.169535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.169546 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.169565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.169576 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.272545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.272612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.272634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.272699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.272721 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.375879 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.375945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.375963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.375991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.376006 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.479309 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.479366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.479380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.479396 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.479411 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.581460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.581489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.581497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.581510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.581518 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.683732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.683813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.683833 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.683859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.683880 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.786623 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.786722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.786746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.786785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.786809 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.888573 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.888615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.888627 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.888667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.888684 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.961466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:11 crc kubenswrapper[4762]: E0217 14:06:11.961625 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.961607635 +0000 UTC m=+84.541608297 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.961715 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:11 crc kubenswrapper[4762]: E0217 14:06:11.961801 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:11 crc kubenswrapper[4762]: E0217 14:06:11.961846 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.961837581 +0000 UTC m=+84.541838233 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.991500 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.991545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.991558 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.991574 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:11 crc kubenswrapper[4762]: I0217 14:06:11.991585 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:11Z","lastTransitionTime":"2026-02-17T14:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.040211 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:06:52.683455066 +0000 UTC Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.062858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.062916 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.062960 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063054 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063071 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 
14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063088 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063100 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063135 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:44.063117176 +0000 UTC m=+84.643117828 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063153 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:44.063146177 +0000 UTC m=+84.643146839 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063218 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063267 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063283 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.063342 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:44.063321541 +0000 UTC m=+84.643322203 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.070126 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.070157 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.070235 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.070270 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.070309 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:12 crc kubenswrapper[4762]: E0217 14:06:12.070385 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.093910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.093947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.093959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.093978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.093991 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.196505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.196532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.196540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.196553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.196562 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.299338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.299781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.299877 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.299962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.300039 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.402352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.402388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.402400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.402415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.402424 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.505042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.505083 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.505095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.505111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.505123 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.607232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.607276 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.607287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.607303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.607314 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.709684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.709722 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.709730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.709745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.709754 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.813533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.813572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.813606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.813622 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.813633 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.916000 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.916048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.916061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.916075 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:12 crc kubenswrapper[4762]: I0217 14:06:12.916085 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:12Z","lastTransitionTime":"2026-02-17T14:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.017703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.017734 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.017742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.017755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.017763 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.040335 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:30:49.693557481 +0000 UTC Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.070700 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:13 crc kubenswrapper[4762]: E0217 14:06:13.070828 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.119917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.119955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.119964 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.119979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.119988 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.222273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.222389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.222403 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.222432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.222448 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.324592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.324664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.324683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.324702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.324717 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.427221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.427313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.427323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.427338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.427348 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.529555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.529600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.529611 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.529628 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.529658 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.632150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.632209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.632220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.632237 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.632250 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.734606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.734634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.734664 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.734676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.734685 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.837046 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.837088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.837101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.837118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.837129 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.939596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.939690 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.939710 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.939733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:13 crc kubenswrapper[4762]: I0217 14:06:13.939747 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:13Z","lastTransitionTime":"2026-02-17T14:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.040845 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 15:50:54.612062863 +0000 UTC Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.042514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.042570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.042582 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.042600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.042613 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.070577 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.070694 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.070721 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:14 crc kubenswrapper[4762]: E0217 14:06:14.070851 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:14 crc kubenswrapper[4762]: E0217 14:06:14.071167 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:14 crc kubenswrapper[4762]: E0217 14:06:14.071311 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.145676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.145735 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.145746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.145767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.145781 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.248808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.248872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.248893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.248920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.248941 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.351303 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.351359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.351370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.351400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.351413 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.454216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.454277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.454295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.454316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.454377 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.556936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.556978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.556988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.557172 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.557389 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.625506 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.639702 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.642407 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.652487 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.660975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.661021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.661030 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.661043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.661053 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.662887 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.675757 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.690294 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.702550 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.719944 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.732158 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.744843 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.758336 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.763066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.763108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.763121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.763136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.763147 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.769676 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.781824 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.793500 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.804054 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.822962 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a9
35a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486010777d56a6abb96d278afe402a89cb5b0c06b3656e95fdc009e25783eecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:05:51Z\\\",\\\"message\\\":\\\"shift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:05:51.328856 6214 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0217 14:05:51.328931 6214 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 
6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.834497 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.847727 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.865360 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.865413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.865439 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.865454 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.865462 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.968831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.968889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.968899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.968918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:14 crc kubenswrapper[4762]: I0217 14:06:14.968931 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:14Z","lastTransitionTime":"2026-02-17T14:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.041157 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:58:02.873352457 +0000 UTC Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.070225 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:15 crc kubenswrapper[4762]: E0217 14:06:15.070392 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.071166 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.071208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.071218 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.071235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.071244 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.173708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.173751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.173760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.173776 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.173785 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.275955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.275985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.275993 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.276006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.276015 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.379301 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.379375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.379399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.379428 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.379450 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.482401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.482437 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.482445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.482458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.482466 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.584487 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.584539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.584553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.584572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.584582 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.686766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.686796 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.686803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.686816 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.686824 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.789549 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.789626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.789638 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.789718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.789735 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.893004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.893053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.893065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.893086 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.893099 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.995308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.995355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.995370 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.995386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:15 crc kubenswrapper[4762]: I0217 14:06:15.995400 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:15Z","lastTransitionTime":"2026-02-17T14:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.042226 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 13:48:55.345249817 +0000 UTC Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.070932 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.071006 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.071082 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.071093 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.071165 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.071313 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.101829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.101910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.101924 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.101949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.101963 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.205697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.205782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.205808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.205841 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.205866 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.308177 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.308211 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.308221 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.308235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.308246 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.410725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.410762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.410773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.410795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.410806 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.512991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.513025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.513035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.513049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.513059 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.616609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.617214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.617264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.617293 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.617306 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.720196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.720241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.720254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.720270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.720279 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.822826 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.822873 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.822887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.822907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.822919 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.830991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.831034 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.831042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.831056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.831065 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.843289 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.846827 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.846864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.846875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.846890 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.846902 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.860163 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.864875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.864914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.864943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.864959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.864973 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.877303 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.883626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.883686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.883699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.883718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.883731 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.896613 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.901522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.901579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.901594 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.901616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.901627 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.916927 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:16 crc kubenswrapper[4762]: E0217 14:06:16.917110 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.925888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.925927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.925939 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.925959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:16 crc kubenswrapper[4762]: I0217 14:06:16.925970 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:16Z","lastTransitionTime":"2026-02-17T14:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.029200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.029253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.029265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.029287 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.029309 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.043323 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 14:15:15.888205297 +0000 UTC Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.070889 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:17 crc kubenswrapper[4762]: E0217 14:06:17.071091 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.132667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.132708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.132718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.132736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.132748 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.235521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.235568 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.235583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.235613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.235628 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.337889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.337921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.337928 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.337942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.337951 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.440398 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.440431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.440440 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.440453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.440462 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.542526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.542563 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.542571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.542586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.542595 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.647368 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.647401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.647412 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.647427 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.647437 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.749820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.749858 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.749868 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.749883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.749893 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.851960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.852023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.852058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.852121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.852145 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.954462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.954908 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.955074 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.955226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:17 crc kubenswrapper[4762]: I0217 14:06:17.955369 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:17Z","lastTransitionTime":"2026-02-17T14:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.043491 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 03:59:33.238455172 +0000 UTC Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.058225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.058450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.058534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.058627 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.058738 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.070799 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.070817 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.070912 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:18 crc kubenswrapper[4762]: E0217 14:06:18.071039 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:18 crc kubenswrapper[4762]: E0217 14:06:18.071094 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:18 crc kubenswrapper[4762]: E0217 14:06:18.071133 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.162580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.162901 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.163053 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.163198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.163324 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.266095 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.266148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.266160 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.266180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.266195 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.368269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.368314 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.368328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.368346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.368360 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.471269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.471325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.471338 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.471357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.471370 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.573192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.573424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.573550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.573660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.573794 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.677198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.677244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.677260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.677280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.677292 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.780219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.780306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.780331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.780361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.780383 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.882756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.882862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.882881 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.882905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.882923 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.985161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.985216 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.985228 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.985242 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:18 crc kubenswrapper[4762]: I0217 14:06:18.985252 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:18Z","lastTransitionTime":"2026-02-17T14:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.044216 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 11:28:06.705004966 +0000 UTC Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.070809 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:19 crc kubenswrapper[4762]: E0217 14:06:19.071017 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.072800 4762 scope.go:117] "RemoveContainer" containerID="cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2" Feb 17 14:06:19 crc kubenswrapper[4762]: E0217 14:06:19.073163 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.085876 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb
45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.088185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.088238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.088274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.088294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.088305 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.121118 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.134215 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.147448 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.158461 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.169597 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc 
kubenswrapper[4762]: I0217 14:06:19.183302 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.190698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.190744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.190752 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.190765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.190774 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.195383 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.206621 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.215974 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.228091 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.238790 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.252494 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 
2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.262889 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.281000 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.292888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.292919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.292931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.292947 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.292958 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.293890 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.303950 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.313891 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 
14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.396590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.396626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.396637 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.396681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.396692 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.498814 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.498852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.498862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.498878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.498891 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.601357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.601401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.601413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.601430 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.601442 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.703793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.703828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.703837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.703853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.703862 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.806836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.806874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.806883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.806896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.806905 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.909329 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.909379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.909396 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.909424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
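All of the NodeNotReady churn above carries one root message: no CNI configuration file in /etc/kubernetes/cni/net.d/. Conceptually the runtime's network-readiness probe reduces to a directory scan, and NetworkReady stays false until a network config file appears, which is consistent with ovnkube-controller (the component expected to write that config) crash-looping earlier in this log. A hedged sketch of that scan follows; the accepted extensions mirror libcni convention (*.conf, *.conflist, *.json) and are an assumption here, not read from this node.

package main

import (
    "fmt"
    "os"
    "path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network config.
// The extension list follows libcni convention; treat it as an assumption.
func hasCNIConfig(dir string) (bool, error) {
    entries, err := os.ReadDir(dir)
    if err != nil {
        return false, err
    }
    for _, e := range entries {
        if e.IsDir() {
            continue
        }
        switch filepath.Ext(e.Name()) {
        case ".conf", ".conflist", ".json":
            return true, nil
        }
    }
    return false, nil
}

func main() {
    ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
    if err != nil {
        fmt.Println("NetworkReady=false:", err)
        return
    }
    if !ok {
        // Mirrors the condition kubelet surfaces above as
        // "container runtime network not ready: NetworkReady=false".
        fmt.Println("NetworkReady=false: no CNI configuration file found")
        return
    }
    fmt.Println("NetworkReady=true")
}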
Feb 17 14:06:19 crc kubenswrapper[4762]: I0217 14:06:19.909442 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:19Z","lastTransitionTime":"2026-02-17T14:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.012085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.012129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.012141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.012155 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.012167 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.045108 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 03:56:00.535343782 +0000 UTC
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.069852 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.069919 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.069862 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:20 crc kubenswrapper[4762]: E0217 14:06:20.070026 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:20 crc kubenswrapper[4762]: E0217 14:06:20.070150 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:06:20 crc kubenswrapper[4762]: E0217 14:06:20.070270 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
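Each "Failed to update status for pod" entry that resumes below is the kubelet's status manager issuing a strategic-merge patch against the pod's status subresource; the validating webhook pod.network-node-identity.openshift.io sits on that request path, so its expired serving certificate fails every patch on this node. A minimal client-go sketch of the same call shape; the kubeconfig path and the patch body are illustrative assumptions, not values taken from this log.

package main

import (
    "context"
    "fmt"

    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    "k8s.io/apimachinery/pkg/types"
    "k8s.io/client-go/kubernetes"
    "k8s.io/client-go/tools/clientcmd"
)

func main() {
    // Assumed kubeconfig location for illustration.
    cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
    if err != nil {
        panic(err)
    }
    client, err := kubernetes.NewForConfig(cfg)
    if err != nil {
        panic(err)
    }
    // Illustrative patch body; the real status-manager patch is the large
    // escaped JSON visible in the surrounding log entries.
    patch := []byte(`{"status":{"conditions":[{"type":"Ready","status":"True"}]}}`)
    _, err = client.CoreV1().Pods("openshift-network-operator").Patch(
        context.TODO(),
        "network-operator-58b4c7f79c-55gtf",
        types.StrategicMergePatchType,
        patch,
        metav1.PatchOptions{},
        "status", // the status subresource, which these entries target
    )
    if err != nil {
        // With the webhook certificate expired this surfaces as:
        // Internal error occurred: failed calling webhook ... tls: failed to verify certificate
        fmt.Println("failed to patch status:", err)
    }
}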
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.088161 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.103314 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.114475 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.114526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.114537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.114553 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.114565 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.118090 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.138220 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.151732 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.162959 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.175314 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.189585 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2
b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.207748 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.217100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.217134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.217144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.217161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.217174 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.218219 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.231755 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.246385 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.257147 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc 
kubenswrapper[4762]: I0217 14:06:20.269552 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.282391 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.295870 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.305540 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.318993 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.319070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.319156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.319180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.319209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.319234 4762 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.421918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.422361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.422461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.422533 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.422589 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.525076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.525105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.525115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.525128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.525136 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.627981 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.628022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.628036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.628057 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.628072 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.730026 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.730094 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.730107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.730123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.730136 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.832840 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.832878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.832886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.832901 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.832913 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.935577 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.935850 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.935919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.935984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:20 crc kubenswrapper[4762]: I0217 14:06:20.936045 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:20Z","lastTransitionTime":"2026-02-17T14:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.039009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.039060 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.039076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.039100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.039117 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.045214 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 12:21:29.639963046 +0000 UTC Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.070405 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:21 crc kubenswrapper[4762]: E0217 14:06:21.070552 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
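
Annotation: every "Failed to update status for pod" record above shares one root cause reported in the error text itself: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired 2025-08-24T17:21:41Z, while the node clock reads 2026-02-17. A quick way to confirm this from the node is to pull the serving certificate and print its validity window. The sketch below is illustrative only: the host and port come from the log; the use of Python 3 and the openssl CLI on the host is an assumption.

    import ssl
    import subprocess

    # Endpoint taken from the webhook error above
    # (pod.network-node-identity.openshift.io).
    HOST, PORT = "127.0.0.1", 9743

    # Fetch the serving certificate without verifying it (it is expired,
    # so a verified handshake would fail); ssl.get_server_certificate
    # retrieves the peer certificate without validation by default.
    pem = ssl.get_server_certificate((HOST, PORT))

    # Hand the PEM to the openssl CLI to print the validity window.
    # For this log we would expect: notAfter=Aug 24 17:21:41 2025 GMT.
    out = subprocess.run(
        ["openssl", "x509", "-noout", "-startdate", "-enddate"],
        input=pem.encode(), capture_output=True, check=True,
    )
    print(out.stdout.decode())

Until that certificate is reissued (or the apparent clock skew is resolved), the kubelet will keep logging one of these patch failures for every pod whose status it tries to update.
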
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.141049 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.141089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.141101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.141141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.141153 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.243603 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.243636 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.243663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.243678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.243693 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.346092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.346345 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.346432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.346519 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.346596 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.449503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.449535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.449544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.449559 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.449568 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.552269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.552599 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.552872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.553096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.553282 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.655851 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.655902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.655914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.655931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.655943 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.758493 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.758829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.758915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.759035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.759119 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.861773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.862070 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.862143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.862239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.862303 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.965290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.965334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.965346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.965364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:21 crc kubenswrapper[4762]: I0217 14:06:21.965375 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:21Z","lastTransitionTime":"2026-02-17T14:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.045401 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 04:28:01.410002738 +0000 UTC Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.067307 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.067342 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.067351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.067365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.067377 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.070598 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.070601 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:22 crc kubenswrapper[4762]: E0217 14:06:22.070727 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
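
Annotation: the NodeNotReady heartbeats repeating above all carry the same condition: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration file yet, and the node stays NotReady until the network plugin writes one. A minimal poll loop to watch for that file appearing might look like the following sketch; the directory comes from the log message, while the extension list reflects the standard CNI config loader and is otherwise an assumption.

    import glob
    import os
    import time

    # Directory named in the KubeletNotReady condition above.
    CNI_DIR = "/etc/kubernetes/cni/net.d/"

    # The standard CNI loader picks up .conf, .conflist and .json files;
    # poll until at least one shows up.
    while True:
        configs = sorted(
            p for ext in ("*.conf", "*.conflist", "*.json")
            for p in glob.glob(os.path.join(CNI_DIR, ext))
        )
        if configs:
            print("CNI config present:", configs)
            break
        print("still empty:", CNI_DIR)
        time.sleep(2)

Once a configuration lands in that directory, NetworkReady flips to true and this once-per-100ms block of NodeNotReady records stops.
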
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.070745 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:22 crc kubenswrapper[4762]: E0217 14:06:22.070808 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:22 crc kubenswrapper[4762]: E0217 14:06:22.070875 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.170770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.170809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.170820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.170845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.170860 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.273676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.273706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.273717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.273733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.273744 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.375415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.375451 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.375463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.375479 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.375491 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.477224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.477260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.477268 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.477280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.477289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.579969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.580006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.580015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.580031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.580040 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.682554 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.682584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.682593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.682608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.682619 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.785265 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.785317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.785364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.785384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.785395 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.887575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.887624 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.887676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.887705 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.887727 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.990008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.990052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.990067 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.990128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:22 crc kubenswrapper[4762]: I0217 14:06:22.990157 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:22Z","lastTransitionTime":"2026-02-17T14:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.046484 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 08:35:06.477477239 +0000 UTC Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.070089 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:23 crc kubenswrapper[4762]: E0217 14:06:23.070185 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
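
Annotation: the certificate_manager.go:356 lines interleaved above show the kubelet-serving certificate is still valid until 2026-02-24 05:53:03 UTC, but every sampled rotation deadline (2025-12-04, 2025-12-26, 2026-01-03, 2026-01-05) already lies in the past relative to the node clock, so rotation is treated as due on every pass, which would explain why the line recurs at roughly one-second intervals; the deadline moving between lines is consistent with the certificate manager re-jittering it on each evaluation (standard client-go certificate-manager behaviour, stated here as background rather than taken from this log). A small sketch to check one of these lines:

    import re
    from datetime import datetime, timezone

    # One of the certificate_manager.go lines from above.
    line = ("Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, "
            "rotation deadline is 2026-01-03 12:21:29.639963046 +0000 UTC")

    # Pull out both timestamps (ignoring the fractional seconds).
    exp_s, dl_s = re.findall(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})", line)
    fmt = "%Y-%m-%d %H:%M:%S"
    expiration = datetime.strptime(exp_s, fmt).replace(tzinfo=timezone.utc)
    deadline = datetime.strptime(dl_s, fmt).replace(tzinfo=timezone.utc)

    # The surrounding journal records were written at 2026-02-17T14:06:23Z.
    now = datetime(2026, 2, 17, 14, 6, 23, tzinfo=timezone.utc)

    print("rotation overdue:", now > deadline)    # True: deadline passed in January
    print("cert still valid:", now < expiration)  # True: expires 2026-02-24

Because now is past the deadline on every sample, the manager keeps concluding that rotation is due and retries, producing one of these lines per second alongside the NodeNotReady heartbeats.
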
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.092918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.092950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.092960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.092976 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.092988 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.195214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.195294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.195306 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.195325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.195336 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.297712 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.297748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.297758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.297770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.297780 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.400073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.400327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.400420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.400512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.400591 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.504063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.504117 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.504131 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.504146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.504157 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.606450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.606781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.606894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.606998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.607099 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.709825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.710103 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.710190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.710276 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.710354 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.813505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.813554 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.813565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.813581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.813595 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.915831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.915911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.915931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.915959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:23 crc kubenswrapper[4762]: I0217 14:06:23.915982 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:23Z","lastTransitionTime":"2026-02-17T14:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.018489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.018528 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.018537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.018550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.018560 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.046926 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 19:45:56.182808489 +0000 UTC Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.070337 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:24 crc kubenswrapper[4762]: E0217 14:06:24.070481 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.070539 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.070341 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:24 crc kubenswrapper[4762]: E0217 14:06:24.070659 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:24 crc kubenswrapper[4762]: E0217 14:06:24.070691 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.121274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.121316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.121323 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.121377 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.121387 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.223600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.223635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.223660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.223674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.223683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.325385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.325419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.325427 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.325441 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.325450 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.428098 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.428138 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.428169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.428183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.428191 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.529942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.529979 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.529987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.530001 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.530014 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.632683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.632737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.632756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.632777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.632794 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.735618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.735896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.735985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.736107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.736202 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.838883 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.838923 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.838934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.838951 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.838964 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.941156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.941258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.941288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.941317 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:24 crc kubenswrapper[4762]: I0217 14:06:24.941338 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:24Z","lastTransitionTime":"2026-02-17T14:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.043902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.043936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.043945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.043960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.043970 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.047365 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:36:07.278019399 +0000 UTC Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.070677 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:25 crc kubenswrapper[4762]: E0217 14:06:25.070812 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.146324 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.146358 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.146366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.146380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.146391 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.248936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.248977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.248985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.248998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.249007 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.351845 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.351905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.351918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.351933 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.351944 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.454217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.454302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.454310 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.454325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.454334 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.590249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.590300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.590311 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.590327 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.590341 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.692717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.692743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.692753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.692765 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.692773 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.795066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.795109 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.795120 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.795134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.795144 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.897141 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.897425 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.897530 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.897634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:25 crc kubenswrapper[4762]: I0217 14:06:25.897750 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:25Z","lastTransitionTime":"2026-02-17T14:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.000055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.000126 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.000143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.000170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.000187 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.047683 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 17:43:34.259766701 +0000 UTC Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.070136 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.070200 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:26 crc kubenswrapper[4762]: E0217 14:06:26.070298 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.070136 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:26 crc kubenswrapper[4762]: E0217 14:06:26.070408 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:26 crc kubenswrapper[4762]: E0217 14:06:26.070588 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.102609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.102678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.102691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.102718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.102731 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.205880 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.205925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.205934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.205955 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.205967 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.307144 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:26 crc kubenswrapper[4762]: E0217 14:06:26.307359 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:26 crc kubenswrapper[4762]: E0217 14:06:26.307432 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:06:58.307409239 +0000 UTC m=+98.887409891 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.308670 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.308715 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.308774 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.308809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.308829 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.411164 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.411238 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.411252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.411277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.411291 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.514011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.514048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.514058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.514071 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.514081 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.616571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.616703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.616716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.616739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.616753 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.720042 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.720096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.720107 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.720127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.720138 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.822706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.822747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.822759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.822778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.822795 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.925568 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.925607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.925616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.925630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:26 crc kubenswrapper[4762]: I0217 14:06:26.925656 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:26Z","lastTransitionTime":"2026-02-17T14:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.028673 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.028723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.028736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.028761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.028775 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.047970 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:32:36.308204098 +0000 UTC Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.070336 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.070476 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.089207 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.089254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.089263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.089280 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.089297 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.101472 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:27Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.105248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.105278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.105290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.105309 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.105322 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.116326 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:27Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.119545 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.119583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.119594 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.119607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.119617 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.131083 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:27Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.136311 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.136491 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.136600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.136725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.136844 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.148728 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:27Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.152463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.152514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.152526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.152544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.152556 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.163730 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:27Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:27 crc kubenswrapper[4762]: E0217 14:06:27.164108 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.165592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.165617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.165627 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.165655 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.165668 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.268214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.268254 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.268267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.268282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.268293 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.370691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.370757 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.370769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.370794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.370811 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.473162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.473199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.473207 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.473255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.473267 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.575743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.575786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.575803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.575825 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.575836 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.678185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.678241 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.678252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.678269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.678280 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.780462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.780500 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.780510 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.780534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.780547 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.882702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.882747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.882758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.882773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.882784 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.985219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.985284 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.985296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.985311 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:27 crc kubenswrapper[4762]: I0217 14:06:27.985322 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:27Z","lastTransitionTime":"2026-02-17T14:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.048482 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 15:24:57.198564753 +0000 UTC Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.070872 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.070927 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:28 crc kubenswrapper[4762]: E0217 14:06:28.070997 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.071011 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:28 crc kubenswrapper[4762]: E0217 14:06:28.071083 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:28 crc kubenswrapper[4762]: E0217 14:06:28.071186 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.080980 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.087182 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.087215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.087225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.087239 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.087250 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.189540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.189587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.189598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.189612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.189621 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.292518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.292815 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.292908 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.293003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.293084 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.382156 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/0.log" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.382202 4762 generic.go:334] "Generic (PLEG): container finished" podID="c1057884-d2c5-4911-9b97-fb4fedba9ab1" containerID="1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f" exitCode=1 Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.382599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerDied","Data":"1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.383332 4762 scope.go:117] "RemoveContainer" containerID="1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.396161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.396198 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.396208 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.396224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.396235 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.398980 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.411589 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 
2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.422178 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.441720 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.452382 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.464395 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.477562 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.487214 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499628 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499832 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499925 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.499746 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.508019 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.517317 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.532577 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.541618 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.552528 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.563890 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.574001 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.589455 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.601022 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.602564 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.602618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.602629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.602666 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.602678 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.614812 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:28Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.704625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.704687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.704699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.704714 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.704769 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.806835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.806869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.806878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.806892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.806902 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.909781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.909853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.909879 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.909902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:28 crc kubenswrapper[4762]: I0217 14:06:28.909919 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:28Z","lastTransitionTime":"2026-02-17T14:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.012570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.012616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.012626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.012653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.012662 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.049436 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 21:51:45.068449137 +0000 UTC Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.069933 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:29 crc kubenswrapper[4762]: E0217 14:06:29.070067 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.115056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.115091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.115099 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.115113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.115123 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.217157 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.217195 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.217206 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.217222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.217233 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.318859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.318887 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.318895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.318907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.318915 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.385614 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/0.log" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.385677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerStarted","Data":"97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.400432 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.411953 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.421328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.421368 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.421378 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.421394 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.421405 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.422696 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.430831 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.440757 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.451522 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.459896 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.471829 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.482554 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 
2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.491485 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.506277 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.517929 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.523105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.523163 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.523176 4762 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.523193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.523204 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.526401 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.537449 4762 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.550450 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.561702 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.578862 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.589416 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.602085 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:29Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.625696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.625729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.625737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.625749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.625758 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.727960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.728025 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.728043 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.728066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.728083 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.829965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.830012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.830022 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.830038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.830050 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.931974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.932011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.932019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.932033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:29 crc kubenswrapper[4762]: I0217 14:06:29.932046 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:29Z","lastTransitionTime":"2026-02-17T14:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.041442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.041519 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.041539 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.041566 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.041593 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.049655 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 20:26:06.339798884 +0000 UTC Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.070243 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.070346 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.070357 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:30 crc kubenswrapper[4762]: E0217 14:06:30.070447 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:30 crc kubenswrapper[4762]: E0217 14:06:30.070543 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:30 crc kubenswrapper[4762]: E0217 14:06:30.070677 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.082853 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.093327 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.115723 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.128612 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.143244 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.145081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.145110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.145122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.145138 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.145149 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.156180 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.171556 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status 
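[Editor's note] The kube-multus restart captured above (restartCount 1, exit code 1) is the multus daemon timing out while polling for its readiness-indicator file, 10-ovn-kubernetes.conf, which the default network plugin writes once its own CNI configuration is in place. A rough stdlib-only approximation of that poll-until-file-exists loop — the interval and timeout values are illustrative assumptions, not multus's actual settings:

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile polls for path until it exists or timeout elapses,
// mirroring the "still waiting for readinessindicatorfile" behavior
// in the kube-multus termination message above.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Illustrative interval/timeout; multus's real values differ.
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second)
	if err != nil {
		fmt.Fprintf(os.Stderr, "readiness indicator check failed: %v\n", err)
		os.Exit(1)
	}
	fmt.Println("default network is ready")
}
```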
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.183438 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.195814 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.206634 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.260345 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.261389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.261418 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.261429 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.261445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.261456 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.271849 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.281460 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05
:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.292775 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.302785 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.320970 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.333849 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.342338 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.353917 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.363782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.363814 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.363823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.363836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.363849 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.466122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.466193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.466209 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.466235 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.466252 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.568609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.568666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.568674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.568688 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.568697 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.670945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.670996 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.671008 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.671027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.671076 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.773497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.773535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.773547 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.773562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.773573 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.875941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.875980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.875992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.876010 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.876020 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.977994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.978035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.978046 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.978061 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:30 crc kubenswrapper[4762]: I0217 14:06:30.978072 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:30Z","lastTransitionTime":"2026-02-17T14:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.050184 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:43:34.510785702 +0000 UTC Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.070728 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:31 crc kubenswrapper[4762]: E0217 14:06:31.070881 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.082029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.082096 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.082113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.082154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.082173 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.184660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.184710 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.184723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.184742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.184758 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.287818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.288153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.288267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.288367 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.288453 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.389878 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.390106 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.390188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.390274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.390355 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.492787 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.492823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.492833 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.492847 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.492857 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.595297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.595329 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.595337 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.595349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.595360 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.697589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.697630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.697653 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.697668 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.697676 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.800000 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.800720 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.800733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.800747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.800755 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.902919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.902971 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.902982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.902997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:31 crc kubenswrapper[4762]: I0217 14:06:31.903006 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:31Z","lastTransitionTime":"2026-02-17T14:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.005146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.005188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.005201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.005219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.005231 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.050718 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 07:28:29.049221799 +0000 UTC Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.070335 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.070400 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.070437 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:32 crc kubenswrapper[4762]: E0217 14:06:32.070545 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:32 crc kubenswrapper[4762]: E0217 14:06:32.070627 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:32 crc kubenswrapper[4762]: E0217 14:06:32.070697 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.071334 4762 scope.go:117] "RemoveContainer" containerID="cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.107389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.107426 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.107434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.107447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.107456 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.209618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.209691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.209708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.209728 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.209740 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.311761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.311811 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.311823 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.311842 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.311854 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.395424 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/2.log" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.398758 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.399147 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.413310 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.415670 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.415761 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.415782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.415805 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.415825 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.428127 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.444670 4762 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e
1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.456460 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.471205 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.483945 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.497792 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.510587 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.517835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.517855 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.517864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.517877 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.517886 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.526754 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.539346 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.565400 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.592948 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.610173 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.618290 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 
14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.620115 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.620146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.620156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.620169 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.620177 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.628902 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.638972 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.658436 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.668127 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.679982 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.723076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.723124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.723137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.723154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.723165 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.825139 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.825183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.825192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.825207 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.825215 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
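Note: every "Failed to update status for pod" record above shares one root cause, spelled out at the end of each message: the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 presents a serving certificate that expired 2025-08-24T17:21:41Z, while the node clock reads 2026-02-17. Below is a minimal Go sketch, not part of any cluster tooling, that dials the Post URL from the log and prints the certificate's validity window so the x509 verdict can be confirmed independently of the kubelet; the address is taken from the log, everything else is illustrative:

```go
// probe_webhook_cert.go: dial the webhook endpoint named in the log and
// print the serving certificate's NotBefore/NotAfter, to confirm the
// "certificate has expired or is not yet valid" verdict by hand.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// 127.0.0.1:9743 is the failing Post URL from the kubelet log.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Skip verification: we want to inspect the certificate even
		// though normal verification would reject it as expired.
		InsecureSkipVerify: true,
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := state.PeerCertificates[0]
	now := time.Now()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v (now=%s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}
```

If the printed notAfter matches 2025-08-24T17:21:41Z, the webhook is still serving the stale certificate, and no amount of kubelet retrying will get a status patch through until it is rotated.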
Has your network provider started?"} Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.928016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.928055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.928066 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.928081 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:32 crc kubenswrapper[4762]: I0217 14:06:32.928091 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:32Z","lastTransitionTime":"2026-02-17T14:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.030328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.030364 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.030375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.030389 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.030400 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.051294 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:39:25.589415501 +0000 UTC Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.070811 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:33 crc kubenswrapper[4762]: E0217 14:06:33.070937 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
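Note: the certificate_manager.go line above is telling on its own: the kubelet-serving certificate expires 2026-02-24 05:53:03 UTC, but the rotation deadline the kubelet computed, 2025-11-26 04:39:25 UTC, is already months in the past at the logged current time of 2026-02-17, so rotation is overdue the moment the kubelet starts. That deadline is a jittered point inside the certificate's validity window; the sketch below models it as a uniform draw between 70% and 90% of the lifetime. Both that range and the assumed one-year validity are assumptions for illustration, not verified kubelet constants:

```go
// rotation_deadline.go: a sketch of the jittered rotation deadline behind
// the certificate_manager.go log line. The 70%-90% window and the one-year
// validity are assumptions, not verified constants.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a point uniformly between 70% and 90% of the
// certificate's validity period, so a fleet of kubelets does not all
// attempt rotation at the same instant.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	frac := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(float64(total) * frac))
}

func main() {
	// Expiry value taken from the certificate_manager.go line above.
	notAfter, err := time.Parse("2006-01-02 15:04:05", "2026-02-24 05:53:03")
	if err != nil {
		panic(err)
	}
	notBefore := notAfter.Add(-365 * 24 * time.Hour) // assumed validity
	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Printf("rotate at %s (cert expires %s)\n",
		deadline.Format(time.RFC3339), notAfter.Format(time.RFC3339))
}
```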
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.132874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.132910 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.132920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.132932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.132942 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.235930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.235968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.235978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.235994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.236006 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.338225 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.338285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.338297 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.338313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.338323 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.404042 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/3.log" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.404635 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/2.log" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.406977 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" exitCode=1 Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.407019 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.407050 4762 scope.go:117] "RemoveContainer" containerID="cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.407600 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:06:33 crc kubenswrapper[4762]: E0217 14:06:33.407913 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.421140 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.431979 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.440689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.440741 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.440753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.440770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.440785 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.444300 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.458700 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.468187 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.479940 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.490635 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.500302 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.517107 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf17713546c5c2de874b280f95fe3bc0983239a935a90c664f797e5b712459e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:05Z\\\",\\\"message\\\":\\\"BOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.110\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 14:06:04.997949 6437 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997958 6437 services_controller.go:453] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics template LB for network=default: []services.LB{}\\\\nI0217 14:06:04.997966 6437 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0217 14:06:04.997536 6437 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:32Z\\\",\\\"message\\\":\\\"7 14:06:32.956311 6840 ovn.go:134] Ensuring zone 
local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0217 14:06:32.956341 6840 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0217 14:06:32.956366 6840 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956384 6840 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956393 6840 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-xpj6v in node crc\\\\nI0217 14:06:32.956399 6840 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v after 0 failed attempt(s)\\\\nI0217 14:06:32.956405 6840 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956425 6840 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:06:32.956485 6840 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.527718 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.538779 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 
2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.542158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.542179 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.542188 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.542199 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.542208 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.548535 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.559034 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.568330 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.584814 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.594401 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.608310 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.643949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.644219 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc 
kubenswrapper[4762]: I0217 14:06:33.644299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.644384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.644467 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.649035 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.659336 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.746969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.747012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.747023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.747041 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.747053 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.849635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.849698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.849706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.849719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.849728 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.954124 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.954186 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.954203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.954224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:33 crc kubenswrapper[4762]: I0217 14:06:33.954248 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:33Z","lastTransitionTime":"2026-02-17T14:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.051696 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 15:10:02.31095269 +0000 UTC Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.056619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.056674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.056685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.056700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.056711 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.070160 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.070177 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.070255 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:34 crc kubenswrapper[4762]: E0217 14:06:34.070354 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:34 crc kubenswrapper[4762]: E0217 14:06:34.070461 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:34 crc kubenswrapper[4762]: E0217 14:06:34.070550 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.158921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.158970 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.158983 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.159001 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.159014 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.261948 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.262029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.262058 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.262089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.262114 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.364162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.364379 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.364511 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.364633 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.364784 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.412075 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/3.log" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.416609 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:06:34 crc kubenswrapper[4762]: E0217 14:06:34.418243 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.430278 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 
14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.442784 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.453101 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.467635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.467882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.468052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.468240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.468333 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.471158 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.483657 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.495574 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.510947 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.522364 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.535725 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.546743 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.558143 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571407 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.571678 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.581776 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.593561 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.604086 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.613940 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.629616 4762 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:32Z\\\",\\\"message\\\":\\\"7 14:06:32.956311 6840 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0217 14:06:32.956341 6840 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0217 14:06:32.956366 6840 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956384 6840 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956393 6840 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-xpj6v in node crc\\\\nI0217 14:06:32.956399 6840 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v after 0 failed attempt(s)\\\\nI0217 14:06:32.956405 6840 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956425 6840 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:06:32.956485 6840 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.640735 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.655313 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.674472 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.674665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.674784 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.675010 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.675176 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.777676 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.777717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.777728 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.777743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.777754 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.880180 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.880485 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.880548 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.880675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.880757 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.982859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.982907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.982917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.982935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:34 crc kubenswrapper[4762]: I0217 14:06:34.982947 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:34Z","lastTransitionTime":"2026-02-17T14:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.052632 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:13:21.059940661 +0000 UTC Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.069878 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:35 crc kubenswrapper[4762]: E0217 14:06:35.069978 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.085018 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.085226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.085404 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.085569 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.085683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.188267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.188313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.188322 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.188340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.188350 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.290515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.291153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.291171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.291380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.291391 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.393972 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.394024 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.394038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.394055 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.394066 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.497421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.497501 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.497515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.497536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.497549 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.599692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.599732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.599741 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.599753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.599762 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.702737 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.702775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.702785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.702803 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.702814 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.805112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.805162 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.805278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.805294 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.805302 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.908065 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.908094 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.908102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.908116 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:35 crc kubenswrapper[4762]: I0217 14:06:35.908126 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:35Z","lastTransitionTime":"2026-02-17T14:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.010340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.010562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.010620 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.010740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.010816 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.053599 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:39:13.785070867 +0000 UTC Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.070106 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.070120 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.070294 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:36 crc kubenswrapper[4762]: E0217 14:06:36.070701 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:36 crc kubenswrapper[4762]: E0217 14:06:36.070842 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:36 crc kubenswrapper[4762]: E0217 14:06:36.071109 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.113035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.113063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.113072 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.113089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.113098 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.215701 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.215748 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.215759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.215779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.215794 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.318525 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.318861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.318963 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.319052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.319146 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.421732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.421802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.421822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.421847 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.421865 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.524613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.524687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.524699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.524715 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.524726 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.627193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.627220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.627251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.627266 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.627275 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.729739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.729781 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.729793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.729810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.729821 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.832203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.832234 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.832244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.832258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.832268 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.934409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.934665 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.934822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.934935 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:36 crc kubenswrapper[4762]: I0217 14:06:36.935038 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:36Z","lastTransitionTime":"2026-02-17T14:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.037628 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.037696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.037708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.037724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.037735 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.053965 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:13:57.915697547 +0000 UTC Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.070362 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.070485 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.140330 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.140629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.140758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.140874 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.140992 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.243933 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.243976 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.244011 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.244031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.244042 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.346353 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.346686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.346820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.346922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.347021 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.449584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.449632 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.449669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.449685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.449694 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.552090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.552122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.552130 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.552143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.552152 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.553422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.553457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.553466 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.553474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.553481 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.566173 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.571395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.571492 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.571518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.571552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.571577 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.596318 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.600318 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.600345 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.600375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.600393 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.600402 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.616546 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.620758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.620795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.620802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.620818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.620836 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.634841 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.638371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.638527 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.638597 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.638682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.638748 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.649781 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4762]: E0217 14:06:37.650002 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.653895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.653946 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.653957 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.653976 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.654005 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.756623 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.756682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.756692 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.756706 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.756715 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.858969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.859023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.859231 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.859249 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.859262 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.961755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.961802 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.961810 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.961824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:37 crc kubenswrapper[4762]: I0217 14:06:37.961834 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:37Z","lastTransitionTime":"2026-02-17T14:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.054991 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 17:53:21.666581334 +0000 UTC Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.064200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.064257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.064277 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.064302 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.064321 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.070547 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.070587 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:38 crc kubenswrapper[4762]: E0217 14:06:38.070768 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.070803 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:38 crc kubenswrapper[4762]: E0217 14:06:38.070930 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:38 crc kubenswrapper[4762]: E0217 14:06:38.071043 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.167366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.167409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.167420 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.167436 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.167448 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.270012 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.270062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.270084 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.270112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.270131 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.372319 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.372362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.372372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.372386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.372395 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.474568 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.474863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.474949 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.475032 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.475126 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.577679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.577719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.577728 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.577746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.577759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.679584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.679839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.679909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.680004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.680096 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.782226 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.782251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.782263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.782278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.782288 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.884677 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.884972 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.885069 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.885171 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.885259 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.986903 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.987153 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.987257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.987365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:38 crc kubenswrapper[4762]: I0217 14:06:38.987432 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:38Z","lastTransitionTime":"2026-02-17T14:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.055763 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:11:37.191334866 +0000 UTC Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.070106 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:39 crc kubenswrapper[4762]: E0217 14:06:39.070271 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.089743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.089780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.089788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.089801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.089810 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.192253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.192486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.192666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.192817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.192902 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.295123 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.295163 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.295173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.295189 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.295200 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.396942 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.396968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.396975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.396987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.396995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.498804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.498859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.498869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.498886 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.498894 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.601062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.601101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.601111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.601125 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.601134 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.704064 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.704137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.704150 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.704168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.704183 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.810278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.810957 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.810984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.811015 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.811037 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.914076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.914117 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.914127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.914143 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:39 crc kubenswrapper[4762]: I0217 14:06:39.914153 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:39Z","lastTransitionTime":"2026-02-17T14:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.016543 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.016585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.016615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.016632 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.016665 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.056998 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 18:03:54.253850392 +0000 UTC Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.070316 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.070395 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.070398 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:40 crc kubenswrapper[4762]: E0217 14:06:40.070498 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:40 crc kubenswrapper[4762]: E0217 14:06:40.070558 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:40 crc kubenswrapper[4762]: E0217 14:06:40.070731 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.081764 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.094107 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.108463 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.118096 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.118134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.118144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.118161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.118170 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.125235 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329
552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:32Z\\\",\\\"message\\\":\\\"7 14:06:32.956311 6840 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0217 14:06:32.956341 6840 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0217 14:06:32.956366 6840 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956384 6840 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956393 6840 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-xpj6v in node crc\\\\nI0217 14:06:32.956399 6840 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v after 0 failed attempt(s)\\\\nI0217 14:06:32.956405 6840 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956425 6840 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:06:32.956485 6840 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.138590 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.148596 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.159269 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.175264 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5
ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"st
artTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.188451 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8279948
8ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.198133 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.216850 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.220295 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.220331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.220340 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.220354 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.220364 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.227674 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.240066 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.253356 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.264406 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.280175 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.291275 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.301814 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.311468 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.322609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.322657 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.322669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.322685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.322699 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.424679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.424716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.424724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.424739 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.424748 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.527817 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.528571 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.528659 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.528702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.528713 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.633684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.633721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.633730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.633742 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.633751 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.735997 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.736029 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.736037 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.736050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.736058 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.838101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.838215 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.838242 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.838259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.838271 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.941076 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.941154 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.941167 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.941183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:40 crc kubenswrapper[4762]: I0217 14:06:40.941193 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:40Z","lastTransitionTime":"2026-02-17T14:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.043583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.043625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.043635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.043668 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.043680 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.058004 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 15:01:30.065239674 +0000 UTC
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.070526 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:41 crc kubenswrapper[4762]: E0217 14:06:41.070702 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.146552 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.146587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.146595 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.146609 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.146618 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.249205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.249245 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.249256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.249272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.249282 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
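Every "Failed to update status for pod" record above fails the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired 2025-08-24T17:21:41Z while the node clock reads 2026-02-17. A minimal sketch of how one might pull that serving certificate from the node and read its validity window, assuming Python 3 with the third-party cryptography package installed; the host and port are taken from the errors above:

    import socket
    import ssl

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the kubelet errors

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # we only want to *read* the peer certificate;
    ctx.verify_mode = ssl.CERT_NONE  # verifying it is exactly what already fails

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # the log implies 2025-08-24 17:21:41

Verification is deliberately disabled because a trust decision is what is failing; the point is only to confirm the notBefore/notAfter window the kubelet is rejecting.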
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.351867 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.351906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.351915 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.351929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.351939 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.453745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.453783 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.453791 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.453805 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.453816 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.555292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.555328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.555336 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.555349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.555358 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.658806 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.658857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.658875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.658902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.658919 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.761328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.761371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.761380 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.761395 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.761406 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.863869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.863896 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.863904 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.863921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.863982 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.966522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.966561 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.966570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.966583 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:41 crc kubenswrapper[4762]: I0217 14:06:41.966592 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.058622 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 22:15:14.522638288 +0000 UTC
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.069316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.069356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.069366 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.069386 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.069396 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.070114 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.070151 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.070152 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
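The certificate_manager.go:356 lines recompute the kubelet-serving rotation deadline on each pass and land on a different date every time (2025-12-05, then 2025-12-09, then 2025-12-24) while the expiration stays fixed at 2026-02-24 05:53:03. That pattern is consistent with a deadline drawn at random from late in the certificate's lifetime; the 70-90% band below is an assumption about client-go's certificate manager, not something stated in the log. A toy reproduction in Python:

    import random
    from datetime import datetime, timedelta

    def rotation_deadline(not_before: datetime, not_after: datetime) -> datetime:
        """Pick a random deadline in the (assumed) 70-90% band of the lifetime."""
        lifetime = not_after - not_before
        fraction = 0.7 + 0.2 * random.random()  # uniform in [0.7, 0.9)
        return not_before + timedelta(seconds=lifetime.total_seconds() * fraction)

    # Expiration taken from the log; the issuance date is assumed for illustration.
    not_after = datetime(2026, 2, 24, 5, 53, 3)
    not_before = not_after - timedelta(days=365)
    for _ in range(3):
        print(rotation_deadline(not_before, not_after))  # a different date each call

The jitter spreads rotation attempts out so a fleet of kubelets does not hammer the signer at the same instant.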
Feb 17 14:06:42 crc kubenswrapper[4762]: E0217 14:06:42.070227 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:42 crc kubenswrapper[4762]: E0217 14:06:42.070280 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:42 crc kubenswrapper[4762]: E0217 14:06:42.070394 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.171611 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.171669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.171683 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.171717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.171730 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.274486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.274550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.274567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.274590 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.274606 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.377112 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.377161 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.377172 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.377191 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.377204 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.479934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.480035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.480090 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.480122 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.480159 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.583290 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.583357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.583368 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.583388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.583401 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.686610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.686678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.686687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.686707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.686719 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.789870 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.789919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.789930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.789950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.789970 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
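The recurring NodeNotReady condition has a single trigger: the runtime's network gate stays NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/, and the multus record further up similarly waited on a readiness indicator file (/host/run/multus/cni/net.d/10-ovn-kubernetes.conf) before reporting ready. A small sketch of that poll-until-the-config-appears pattern; the directory comes from the log, while the helper name, timeout, and interval are hypothetical choices for illustration:

    import glob
    import os.path
    import time

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the condition

    def wait_for_cni_config(conf_dir: str, timeout: float = 45.0,
                            interval: float = 1.0) -> str:
        """Poll until a CNI config file exists in conf_dir, else raise.

        Hypothetical helper; timeout and interval are illustrative only.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            for pattern in ("*.conf", "*.conflist", "*.json"):
                matches = sorted(glob.glob(os.path.join(conf_dir, pattern)))
                if matches:
                    return matches[0]
            time.sleep(interval)
        raise TimeoutError(f"no CNI configuration file in {conf_dir}")

    # On this node the call would time out until OVN-Kubernetes writes its config:
    # wait_for_cni_config(CNI_CONF_DIR)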
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.892576 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.892614 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.892625 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.892671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.892683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.995970 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.996010 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.996021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.996038 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4762]: I0217 14:06:42.996050 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.059070 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 12:01:49.444127593 +0000 UTC Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.070128 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:43 crc kubenswrapper[4762]: E0217 14:06:43.070271 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.097684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.097726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.097735 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.097750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.097761 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.199455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.199502 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.199516 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.199541 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.199553 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.301945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.301982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.301991 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.302006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.302015 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.405479 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.405521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.405532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.405555 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.405567 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.508217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.508260 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.508270 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.508285 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.508296 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.611394 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.611431 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.611442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.611457 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.611468 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.713795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.713824 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.713831 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.713846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.713854 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.815718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.815762 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.815774 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.815790 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.815800 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.917865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.917894 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.917902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.917916 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.917925 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.995068 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:43 crc kubenswrapper[4762]: I0217 14:06:43.995196 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:43 crc kubenswrapper[4762]: E0217 14:06:43.995268 4762 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:43 crc kubenswrapper[4762]: E0217 14:06:43.995318 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.995299359 +0000 UTC m=+148.575300011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:43 crc kubenswrapper[4762]: E0217 14:06:43.995529 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.995515195 +0000 UTC m=+148.575515847 (durationBeforeRetry 1m4s). 
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.020135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.020172 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.020181 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.020196 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.020212 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.060001 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:53:21.048365681 +0000 UTC
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.070491 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.070524 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.070748 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.070843 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.070889 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.071068 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.096364 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.096434 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.096485 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096600 4762 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096601 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096627 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096638 4762 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096704 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.096688985 +0000 UTC m=+148.676689637 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096719 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.096713225 +0000 UTC m=+148.676713877 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096767 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096802 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096821 4762 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:44 crc kubenswrapper[4762]: E0217 14:06:44.096896 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.09687224 +0000 UTC m=+148.676872922 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.122877 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.122938 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.122956 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.122982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.122998 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.226283 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.226355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.226371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.226394 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.226411 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.329559 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.329606 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.329619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.329635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.329664 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.431925 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.431977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.431988 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.432006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.432018 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.533982 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.534009 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.534019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.534031 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.534040 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.636292 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.636335 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.636347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.636365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.636377 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.738385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.738444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.738453 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.738465 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.738475 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.840291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.840331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.840342 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.840357 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.840368 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.942267 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.942534 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.942619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.942778 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4762]: I0217 14:06:44.942804 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.044808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.044852 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.044862 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.044885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.044897 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.061146 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 15:16:14.748319701 +0000 UTC Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.070534 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:45 crc kubenswrapper[4762]: E0217 14:06:45.070655 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.315846 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.315882 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.315890 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.315906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.315916 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.417828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.417866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.417875 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.417891 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.417908 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.520335 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.520621 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.520630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.520666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.520676 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.622411 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.622486 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.622504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.622528 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.622544 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.724596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.724635 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.724667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.724691 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.724703 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.826853 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.826897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.826908 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.826922 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.826931 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.928581 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.928626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.928661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.928682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4762]: I0217 14:06:45.928698 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.030584 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.030668 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.030684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.030705 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.030719 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.062165 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 20:31:45.597907096 +0000 UTC Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.070530 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.070583 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.070625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:46 crc kubenswrapper[4762]: E0217 14:06:46.070758 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:46 crc kubenswrapper[4762]: E0217 14:06:46.070846 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:46 crc kubenswrapper[4762]: E0217 14:06:46.071168 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.071448 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:06:46 crc kubenswrapper[4762]: E0217 14:06:46.071669 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.136678 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.136731 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.136744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.136764 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.136777 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.239551 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.239602 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.239613 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.239628 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.239637 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.341760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.341806 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.341822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.341844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.341858 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.445258 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.445308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.445325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.445348 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.445365 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.548129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.548174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.548185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.548205 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.548218 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.650807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.650835 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.650843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.650857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.650868 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.753419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.753450 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.753458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.753473 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.753483 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.855968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.856291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.856432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.856550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.856695 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.959372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.959616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.959772 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.959859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4762]: I0217 14:06:46.959922 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062334 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:58:29.701930438 +0000 UTC Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062834 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062843 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.062870 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.070256 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.070348 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.165582 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.165674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.165697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.165727 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.165750 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.269634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.269738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.269797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.269822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.269877 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.372414 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.372454 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.372463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.372479 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.372489 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.474849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.474888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.474899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.474914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.474924 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.578035 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.578144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.578170 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.578201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.578223 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.682175 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.682227 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.682243 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.682261 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.682275 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.773416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.773452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.773462 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.773478 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.773489 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.792623 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.796362 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.796399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.796407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.796422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.796433 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.810358 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.813884 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.813919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
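[Editor's note: every patch failure in this excerpt shares one root cause. The serving certificate behind the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-02-17, so each TLS handshake fails verification. Below is a minimal Go sketch of the validity-window check that produces this class of x509 error; it is illustrative only, and the certificate path is a hypothetical placeholder, since the log does not name the file on disk.]

// certcheck.go - illustrative sketch: reproduces the x509 validity-window
// check behind the webhook failure above. The path is a placeholder
// (assumption); the log does not identify the certificate file.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("/path/to/webhook-serving.crt") // placeholder path
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	case now.After(cert.NotAfter):
		// This is the branch the kubelet log is hitting:
		// "current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z".
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Printf("certificate valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}

[The same comparison is what the Go TLS stack performs during the handshake, which is why the error repeats identically on every retry until the certificate is rotated or the clock moves back inside the validity window.]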
event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.813929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.813945 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.813957 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.826168 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.829579 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.829618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.829626 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.829654 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.829664 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.841713 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.845456 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.845503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.845514 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.845529 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.845540 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.864567 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:47Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:47 crc kubenswrapper[4762]: E0217 14:06:47.864708 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.866368 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.866412 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.866424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.866444 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.866486 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.968932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.969002 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.969026 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.969059 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4762]: I0217 14:06:47.969087 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.062448 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 17:44:10.791590724 +0000 UTC Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.069897 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:48 crc kubenswrapper[4762]: E0217 14:06:48.070026 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.070257 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:48 crc kubenswrapper[4762]: E0217 14:06:48.070416 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.070473 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:48 crc kubenswrapper[4762]: E0217 14:06:48.070810 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.071316 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.071374 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.071388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.071406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.071420 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.173946 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.173995 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.174005 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.174021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.174054 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.275917 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.275959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.275968 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.275985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.275995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.378941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.378984 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.378992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.379007 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.379017 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.482047 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.482100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.482110 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.482127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.482140 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.585203 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.585262 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.585272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.585288 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.585299 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.687272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.687320 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.687331 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.687352 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.687363 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.789931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.789974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.789989 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.790003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.790013 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.892598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.892669 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.892684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.892700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.892711 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.994592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.994667 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.994680 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.994700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4762]: I0217 14:06:48.994712 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.063583 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:49:40.044197115 +0000 UTC Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.069813 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:49 crc kubenswrapper[4762]: E0217 14:06:49.069933 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.097730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.097775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.097786 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.097801 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.097811 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.199776 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.199861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.199892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.199921 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.199943 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.302217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.302248 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.302256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.302269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.302279 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.405194 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.405252 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.405263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.405282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.405296 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.507253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.507299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.507313 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.507334 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.507348 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.609499 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.609536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.609544 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.609557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.609567 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.711974 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.712073 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.712088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.712103 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.712114 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.815062 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.815119 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.815135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.815158 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.815177 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.917374 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.917432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.917446 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.917461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4762]: I0217 14:06:49.917471 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.020811 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.020865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.020876 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.020898 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.020911 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.064470 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:22:48.59628073 +0000 UTC Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.070136 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.070223 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:50 crc kubenswrapper[4762]: E0217 14:06:50.070375 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.070395 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:50 crc kubenswrapper[4762]: E0217 14:06:50.070532 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:50 crc kubenswrapper[4762]: E0217 14:06:50.070678 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.089017 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.105691 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.119371 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.123622 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.123709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.123726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.123751 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.123770 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.142029 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329
552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:32Z\\\",\\\"message\\\":\\\"7 14:06:32.956311 6840 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0217 14:06:32.956341 6840 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0217 14:06:32.956366 6840 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956384 6840 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956393 6840 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-xpj6v in node crc\\\\nI0217 14:06:32.956399 6840 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v after 0 failed attempt(s)\\\\nI0217 14:06:32.956405 6840 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956425 6840 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:06:32.956485 6840 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.155893 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.169826 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.188048 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.203049 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2
b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.214883 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.226088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.226128 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.226140 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.226156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.226168 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.240428 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec360737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.251891 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.264584 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.275832 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.286129 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.299215 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.307165 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.316458 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.327367 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.328578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.328633 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.328681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.328698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.328710 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.336376 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.432300 4762 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.432351 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.432365 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.432385 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.432399 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.535936 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.536004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.536021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.536044 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.536062 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.638899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.638960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.638970 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.638985 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.638995 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.740708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.740763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.740779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.740795 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.740805 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.843178 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.843222 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.843236 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.843251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.843263 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.945056 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.945089 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.945097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.945111 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4762]: I0217 14:06:50.945121 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.047269 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.047326 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.047339 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.047361 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.047372 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.064607 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 22:14:55.544319013 +0000 UTC Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.070054 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:51 crc kubenswrapper[4762]: E0217 14:06:51.070425 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.150346 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.150415 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.150432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.150458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.150478 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.252770 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.253004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.253019 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.253036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.253048 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.355369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.355400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.355408 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.355422 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.355430 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.458598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.458907 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.459016 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.459101 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.459187 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.562293 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.562413 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.562442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.562473 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.562496 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.665587 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.665698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.665724 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.665747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.665762 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.768607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.768679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.768689 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.768707 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.768718 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.870686 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.870719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.870729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.870745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.870756 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.974033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.974085 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.974097 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.974118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4762]: I0217 14:06:51.974129 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.065017 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 04:20:49.934495811 +0000 UTC Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.070338 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.070381 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:52 crc kubenswrapper[4762]: E0217 14:06:52.070474 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.070518 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:52 crc kubenswrapper[4762]: E0217 14:06:52.070628 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:52 crc kubenswrapper[4762]: E0217 14:06:52.070772 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.078777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.078930 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.078941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.078962 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4762]: I0217 14:06:52.079283 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[node-status block repeats, differing only in timestamps, at 14:06:52.182, 14:06:52.285, 14:06:52.388, 14:06:52.490, 14:06:52.593, 14:06:52.695, 14:06:52.799, 14:06:52.902 and 14:06:53.005]
Feb 17 14:06:53 crc kubenswrapper[4762]: I0217 14:06:53.065312 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 01:01:27.944904795 +0000 UTC
Feb 17 14:06:53 crc kubenswrapper[4762]: I0217 14:06:53.070612 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:53 crc kubenswrapper[4762]: E0217 14:06:53.070741 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block repeats at 14:06:53.108 and 14:06:53.211]
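The certificate_manager entries log the same expiration (2026-02-24 05:53:03 UTC) every second but a different rotation deadline each time. That pattern is consistent with client-go's certificate manager recomputing a randomized deadline somewhere in roughly the 70-90% span of the certificate's validity on every pass; the 70-90% figure is an assumption about the library, not something this log proves. A sketch under that assumption (the notBefore value is hypothetical, chosen one year before the logged expiry):

```python
# Sketch only: a randomized rotation deadline in the 70-90% span of a
# certificate's validity yields a different value on every recomputation,
# matching the shifting "rotation deadline" while the expiration stays fixed.
import random
from datetime import datetime

def rotation_deadline(not_before: datetime, not_after: datetime,
                      rng: random.Random) -> datetime:
    total = not_after - not_before
    # Random point between 70% and 90% of the validity window (assumed policy).
    return not_before + total * (0.7 + 0.2 * rng.random())

not_before = datetime(2025, 2, 24, 5, 53, 3)   # hypothetical issue time
not_after  = datetime(2026, 2, 24, 5, 53, 3)   # expiration from the log
rng = random.Random()
for _ in range(3):
    # Each recomputation lands on a different deadline, like the log lines.
    print("rotation deadline:", rotation_deadline(not_before, not_after, rng))
```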
[node-status block repeats at 14:06:53.314, 14:06:53.417, 14:06:53.521, 14:06:53.624, 14:06:53.727, 14:06:53.830, 14:06:53.932 and 14:06:54.036]
Feb 17 14:06:54 crc kubenswrapper[4762]: I0217 14:06:54.065676 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:51:10.889105072 +0000 UTC
Feb 17 14:06:54 crc kubenswrapper[4762]: I0217 14:06:54.069995 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:54 crc kubenswrapper[4762]: E0217 14:06:54.070125 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:54 crc kubenswrapper[4762]: I0217 14:06:54.070137 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:54 crc kubenswrapper[4762]: I0217 14:06:54.070009 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:54 crc kubenswrapper[4762]: E0217 14:06:54.070232 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:06:54 crc kubenswrapper[4762]: E0217 14:06:54.070344 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[node-status block repeats at 14:06:54.138]
[node-status block repeats at 14:06:54.241, 14:06:54.343, 14:06:54.445, 14:06:54.547, 14:06:54.650, 14:06:54.753, 14:06:54.857, 14:06:54.961 and 14:06:55.064]
Feb 17 14:06:55 crc kubenswrapper[4762]: I0217 14:06:55.065850 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 21:25:18.002764566 +0000 UTC
Feb 17 14:06:55 crc kubenswrapper[4762]: I0217 14:06:55.070127 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:55 crc kubenswrapper[4762]: E0217 14:06:55.070232 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block repeats at 14:06:55.167 and 14:06:55.270]
[node-status block repeats at 14:06:55.373, 14:06:55.477, 14:06:55.580, 14:06:55.682, 14:06:55.784 and 14:06:55.887]
[node-status block repeats at 14:06:55.990]
Feb 17 14:06:56 crc kubenswrapper[4762]: I0217 14:06:56.066369 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 05:35:05.787474423 +0000 UTC
Feb 17 14:06:56 crc kubenswrapper[4762]: I0217 14:06:56.072870 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:56 crc kubenswrapper[4762]: I0217 14:06:56.073018 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:56 crc kubenswrapper[4762]: E0217 14:06:56.073290 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:56 crc kubenswrapper[4762]: I0217 14:06:56.073333 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:06:56 crc kubenswrapper[4762]: E0217 14:06:56.073478 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:56 crc kubenswrapper[4762]: E0217 14:06:56.073591 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
[node-status block repeats at 14:06:56.093 and 14:06:56.195]
[node-status block repeats at 14:06:56.299, 14:06:56.402, 14:06:56.505, 14:06:56.608, 14:06:56.712, 14:06:56.814, 14:06:56.918 and 14:06:57.021]
Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.067026 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 07:21:31.995036482 +0000 UTC
Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.070309 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:57 crc kubenswrapper[4762]: E0217 14:06:57.070413 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.123791 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.123848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.123866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.123885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.123897 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.226416 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.226535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.226575 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.226592 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.226602 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.329675 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.329750 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.329769 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.329791 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.329810 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.431607 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.431663 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.431674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.431688 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.431697 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.534136 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.534611 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.534893 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.535434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.535875 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.642710 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.643755 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.643906 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.644371 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.644503 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.747618 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.747699 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.747709 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.747733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.747745 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.851519 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.852059 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.852325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.852865 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.853344 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.956876 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.956952 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.956971 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.957001 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4762]: I0217 14:06:57.957021 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.060519 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.060591 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.060610 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.060638 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.060692 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.067773 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:27:43.434074401 +0000 UTC Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.070910 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.071130 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.071327 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.071406 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.072010 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.072081 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.072361 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.072614 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.163721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.163777 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.163800 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.163829 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.163851 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.259716 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.259756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.259767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.259782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.259792 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.271361 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:58Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.274708 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.274747 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.274759 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.274779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.274792 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.290022 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:58Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.293895 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.294113 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.294251 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.294376 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.294466 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.308997 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:58Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.312354 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.312596 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.312726 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.312822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.312895 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.323720 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:58Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.327387 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.327567 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.327660 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.327773 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.327864 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.339322 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0948f442-754f-492a-b255-7c21a6e922d3\\\",\\\"systemUUID\\\":\\\"f4e79948-4d35-4f10-94ee-0c0db8bd23cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:58Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.339796 4762 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.341409 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
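Analysis: the repeated patch failures above share one root cause. Before admitting the node-status update, the API server calls the validating webhook node.network-node-identity.openshift.io on https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-17, so TLS verification fails on every attempt until the kubelet exhausts its small fixed retry budget and logs "update node status exceeds retry count". One way to confirm what the listener is actually serving is to fetch its certificate and compare the validity window against the clock; a minimal Go sketch of such a check (a hypothetical diagnostic, not part of the kubelet):

```go
// Hypothetical diagnostic: fetch the certificate served on the webhook
// endpoint seen in the log above and compare its validity window to the clock.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// InsecureSkipVerify is deliberate here: verification already fails,
	// and we only want to inspect the certificate, not trust the connection.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter) // 2025-08-24T17:21:41Z per the log
	if time.Now().After(cert.NotAfter) {
		fmt.Println("expired: matches the x509 error in the log")
	}
}
```

This expired-while-powered-off pattern is common with CRC bundles: the embedded certificates are minted when the bundle is built and can lapse before the VM is next booted, after which the cluster has to rotate them before the node can go Ready.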
event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.341458 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.341474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.341497 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.341513 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.352138 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.352312 4762 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:58 crc kubenswrapper[4762]: E0217 14:06:58.352372 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs podName:63580a98-4d0e-434e-ad09-e7d542e7a5cc nodeName:}" failed. No retries permitted until 2026-02-17 14:08:02.352357925 +0000 UTC m=+162.932358567 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs") pod "network-metrics-daemon-7v8bf" (UID: "63580a98-4d0e-434e-ad09-e7d542e7a5cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.444255 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.444301 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.444312 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.444329 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.444342 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.546711 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.546744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.546753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.546766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.546775 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.649900 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.649941 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.649950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.649965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.649975 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.751837 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.752489 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.752531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.752557 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.752570 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.855532 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.855572 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.855580 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.855594 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.855603 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.957562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.957616 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.957631 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.957681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4762]: I0217 14:06:58.957695 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.060259 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.060300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.060308 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.060325 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.060336 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.068681 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 05:34:16.441478255 +0000 UTC Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.070080 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:59 crc kubenswrapper[4762]: E0217 14:06:59.070289 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.162367 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.162421 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.162435 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.162455 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.162471 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.264919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.264978 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.264987 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.265003 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.265015 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.368210 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.368253 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.368264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.368281 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.368295 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.471224 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.471266 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.471278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.471296 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.471308 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.574931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.574980 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.574992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.575010 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.575023 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.677333 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.677394 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.677406 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.677419 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.677431 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.780132 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.780174 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.780185 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.780201 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.780212 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.881998 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.882037 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.882048 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.882063 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.882073 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.985077 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.985176 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.985193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.985214 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:59 crc kubenswrapper[4762]: I0217 14:06:59.985230 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.069142 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:47:49.922474806 +0000 UTC Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.070429 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.070524 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:00 crc kubenswrapper[4762]: E0217 14:07:00.070580 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:00 crc kubenswrapper[4762]: E0217 14:07:00.070764 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.071317 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:00 crc kubenswrapper[4762]: E0217 14:07:00.071434 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.085548 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18a966ae-76bd-4298-9964-8be5f5b1dc95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7e01d292ae71010507c11cd6fb5d62e1c05231657fb70a3b7d0c8fd4cd50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://691b9e1f1ab34c981df8a1d89e6821bc631c466a70f868b684a8306341664c2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cc739853e83b6e24981081f69b8331cff1d347e9a3faae6ff157ded9c493fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d2d316b0e05155728b7865aa18fca830b333618cd763bf883b85d652c8bc316\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d91429b822d80a00059bb8fc3a140ed48792206a8d76c22bba3575e930cf564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09e3769d3cf9f7263b129ad84f8e6857b378efdf9df887d02c75fa44207f19ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd8744db44c59765c90b2596d9a231e572c2d83df90ba940cbaf6655037d530\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqlz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xpj6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.087921 
4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.087969 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.087992 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.088021 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.088042 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.096943 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e09fb0e-eba7-42d8-a0d3-4ba58b5a7d03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3c19edbdef93769a6929de6ef0b9ba4b29a8b51717408ecdf1f7947f7ec830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ab4c77da70e7710b967b364168154aa79b7d9f1ffb45289c0e02d4fc62100e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fadfb7cfdbe8bc6f161dad84961ead21baa2abe0785a1c516af60fb46c5ef7ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.108450 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fa4635-2b21-44d6-b938-90dda191b9a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://349155c7dadc281ae1a82d565109b6907193d193e323cf9d786fc114c48d035f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fa727077b8828778e67875547a2396323944811bdded7eb35a110de4b6aec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.126576 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1db87bb-5bfe-4834-bfcf-ff26390eda1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72e7632d5143f0f0a182f5af87888bd641dd185f2bb982fefb1de5745bbfc46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33eb6a1416cc6057650b34e76f6f34cf6ca5ef2bb1920528659f80b8968a4d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3071d8f03c0a38a6670068113c4d8063da23f9dce026503a622142c199d63dfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c383cddbf2dba439365c9af28bb5276c676ec36
0737d3767c58467c608f74a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de47a78e41b2ecbf7b0c06dfa4d1ae064157099e38ed7b3ef12f5dca60f4c522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263d83f782e55d76843887a544f469636a25bea029447e2981c6fd30275f6172\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://790ef960b9cc550cda3e175fde3b2c1051303e7767d17003240a7442a3cf0593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba0ea8a07383f004090762dee7d715ca22f5386c467f8930cc464dd5fe38c15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.143485 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34cc9c9e67c0e0cd54db4acf22059e115ace903eb4e315a30a13ae567d1d79b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.158891 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.172702 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4r7p8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1057884-d2c5-4911-9b97-fb4fedba9ab1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:28Z\\\",\\\"message\\\":\\\"2026-02-17T14:05:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089\\\\n2026-02-17T14:05:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9bf4db2a-691d-4712-9438-fa425e317089 to /host/opt/cni/bin/\\\\n2026-02-17T14:05:43Z [verbose] multus-daemon started\\\\n2026-02-17T14:05:43Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:06:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g987m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4r7p8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.184041 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63580a98-4d0e-434e-ad09-e7d542e7a5cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr2nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7v8bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.189888 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.189933 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.189943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.189959 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.189970 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.197190 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8507903e-806f-4e57-bb1e-d218465a9ea3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:05:39Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:05:33.559766 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:05:33.561511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3945850201/tls.crt::/tmp/serving-cert-3945850201/tls.key\\\\\\\"\\\\nI0217 14:05:39.357353 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:05:39.394244 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:05:39.394282 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:05:39.394316 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:05:39.394325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:05:39.410195 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 14:05:39.410216 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 14:05:39.410225 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:05:39.410237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:05:39.410241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:05:39.410244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:05:39.410247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 14:05:39.415801 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.210052 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.220095 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.227951 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s25qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cba5d7d1-c9f6-4012-9380-9abc9449564c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f4174a3acb8fdf0f198a78089269945436e5cd8db693a511b9ec1b4c0c7fb9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58hgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s25qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.239192 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ce1a18-b3b7-40b6-83df-b76ba4fbb232\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bed8e861562f3f4a3b7f5f5f4e5d0c0bb967c52e5c4f4194a2523ab0f51d13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82ea868fa6e75cab3be62095cc9dda7ff43e8ff72d354b56ea22fe84da4bd4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52dcfefdd0138ca17d1c8afc24f0c52b6cbfc51cd089ce6f8069466bcc3110fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a809f511bc5e43a76dce576aebfda78da0065bc6952987e42bfd17becc8ada01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.251091 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6a8e982ef66ce759982ae4bd5b4ac9a0d650aabd393057b1a84c2caf499bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.261723 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb11ce5-3ff7-4743-a879-95285dae2998\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a21d51090aff327279ed7a4f7405c397e5170bb2a9056ec34055fac66a55c4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq6hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:
41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rwhnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.279402 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab134be0-88ef-45ac-80e0-963a60169ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\
\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:32Z\\\",\\\"message\\\":\\\"7 14:06:32.956311 6840 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0217 14:06:32.956341 6840 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0217 14:06:32.956366 6840 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956384 6840 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956393 6840 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-xpj6v in node crc\\\\nI0217 14:06:32.956399 6840 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-xpj6v after 0 failed attempt(s)\\\\nI0217 14:06:32.956405 6840 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-xpj6v\\\\nI0217 14:06:32.956425 6840 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:06:32.956485 6840 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8m5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.292512 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.292570 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.292585 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.292608 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.292629 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.293218 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff6bdd02eebfe288887810cdfc34542dd8d2ba9b0b68c44f6528d6a14800dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2eac2d9478fdba70233b9e208149156d7ee00bc57785fff0b3f748805bc4691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.304457 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-76htw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3db634-a0f8-46b2-b54f-a12a054aa004\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e597eef5a83345ac9f03ff681b1c6bd2f32c811a19f4f29c92636b0f0acb565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pw5l6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-76htw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.316786 4762 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22fa85ee-f73c-44a4-97e9-660bdf0a07f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:05:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d84b996fe9f6caeec14c4abb38d46f09a24c8934212160409d1b1ed92d42d965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://553f09024456b14c3bee4fba70e95a44e5e1b83f9ae37061d74ba8b04ba753f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:05:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvdfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:05:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dw82d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:00Z is after 2025-08-24T17:21:41Z" Feb 17 
14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.396039 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.396092 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.396103 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.396121 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.396152 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.499045 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.499100 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.499118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.499137 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.499152 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.600848 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.600919 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.600934 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.600950 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.600961 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.703384 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.703417 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.703434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.703449 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.703459 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.805183 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.805244 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.805256 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.805275 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.805289 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.907190 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.907246 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.907257 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.907278 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:00 crc kubenswrapper[4762]: I0217 14:07:00.907291 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.009524 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.009578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.009600 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.009630 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.009683 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.069864 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 12:50:41.176160943 +0000 UTC Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.070083 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:01 crc kubenswrapper[4762]: E0217 14:07:01.070249 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.112702 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.112767 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.112785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.112809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.112826 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.215452 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.215505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.215521 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.215536 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.215547 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.317912 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.317965 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.317981 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.318033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.318050 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.420749 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.420785 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.420794 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.420809 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.420819 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.523193 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.523240 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.523250 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.523268 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.523279 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.626400 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.626460 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.626474 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.626495 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.626511 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.729864 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.729909 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.729920 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.729937 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.729950 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.833499 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.833586 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.833612 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.833681 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.833708 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.936565 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.936634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.936700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.936733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4762]: I0217 14:07:01.936759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.038861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.038902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.038914 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.038931 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.038944 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.070392 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:51:25.019323058 +0000 UTC Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.070517 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.070579 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:02 crc kubenswrapper[4762]: E0217 14:07:02.070611 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:02 crc kubenswrapper[4762]: E0217 14:07:02.070787 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.070877 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:02 crc kubenswrapper[4762]: E0217 14:07:02.071130 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.142272 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.142329 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.142347 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.142372 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.142394 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.245526 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.245598 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.245617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.245671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.245690 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.348619 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.348721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.348740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.348766 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.348784 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.451869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.451956 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.451983 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.452014 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.452038 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.554355 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.554442 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.554476 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.554515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.554537 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.657399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.657501 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.657517 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.657540 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.657557 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.760200 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.760263 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.760273 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.760291 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.760303 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.862499 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.862550 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.862560 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.862578 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.862590 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.965506 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.965617 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.965634 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.965682 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4762]: I0217 14:07:02.965701 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.068064 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.068108 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.068129 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.068144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.068154 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.070459 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.070485 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:52:42.283453892 +0000 UTC Feb 17 14:07:03 crc kubenswrapper[4762]: E0217 14:07:03.070608 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.171075 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.171146 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.171168 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.171197 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.171219 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.273946 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.273986 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.273994 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.274006 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.274015 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
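Has your network provider started?"}

The certificate_manager entries interleaved above report a fixed expiration (2026-02-24 05:53:03 UTC) but a rotation deadline that changes on every pass. That is expected behavior: client-go's certificate manager re-draws a jittered rotation deadline inside the certificate's validity window each time it evaluates rotation. A sketch of that jitter, assuming the upstream 70-90% band and a one-year validity; the log records only the expiration, so notBefore here is an assumption:

```go
// rotation.go: reproduces the moving "rotation deadline" seen above. The
// kubelet's certificate manager (client-go) re-derives a jittered deadline
// within the certificate's validity window on each pass, so the deadline
// moves while the expiration stays fixed. The 70-90% band mirrors the
// upstream jitter logic; the issue time (notBefore) is assumed, since the
// log only records the expiration.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a random point between 70% and 90% of the
// certificate's lifetime, counted from notBefore.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z") // expiration from the log
	notBefore := notAfter.AddDate(-1, 0, 0)                         // assumed one-year validity
	for i := 0; i < 5; i++ {
		fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter).UTC())
	}
}
```

With those assumptions the deadline always lands between early November 2025 and mid-January 2026, which brackets the deadlines recorded in this capture.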
Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.376588 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.376629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.376661 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.376696 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.376708 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.479684 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.479725 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.479736 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.479752 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.479762 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.582344 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.582381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.582392 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.582407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.582418 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.684989 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.685027 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.685036 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.685052 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.685063 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.787859 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.787902 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.787911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.787924 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.787934 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.889791 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.889849 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.889866 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.889927 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.889946 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.992464 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.992506 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.992515 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.992531 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4762]: I0217 14:07:03.992541 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.070717 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 12:53:57.016632807 +0000 UTC Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.070893 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.070979 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:04 crc kubenswrapper[4762]: E0217 14:07:04.071115 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.071130 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:04 crc kubenswrapper[4762]: E0217 14:07:04.071230 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:04 crc kubenswrapper[4762]: E0217 14:07:04.071356 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.094434 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.094472 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.094481 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.094496 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.094507 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.196743 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.196808 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.196818 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.196839 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.196851 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.299102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.299173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.299192 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.299220 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.299239 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.402694 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.402744 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.402758 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.402782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.402798 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.506797 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.506860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.506872 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.506892 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.506905 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.609671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.609763 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.609774 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.609793 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.609805 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.712977 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.713023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.713033 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.713050 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.713063 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.816023 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.816105 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.816118 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.816135 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.816146 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.918390 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.918432 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.918461 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.918476 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4762]: I0217 14:07:04.918485 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.020780 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.020822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.020844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.020861 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.020874 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.070439 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:05 crc kubenswrapper[4762]: E0217 14:07:05.070578 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.071437 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:29:45.584948363 +0000 UTC Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.122885 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.122932 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.122943 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.122960 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.122969 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.225032 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.225088 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.225104 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.225127 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.225143 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.328309 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.328358 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.328369 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.328388 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.328400 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.431733 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.431807 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.431828 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.431857 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.431873 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.535445 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.535504 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.535520 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.535538 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.535552 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.639401 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.639507 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.639593 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.639622 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.639699 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.742447 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.742503 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.742518 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.742535 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.742549 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.844589 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.844666 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.844679 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.844697 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.844710 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.946756 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.946820 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.946838 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.946860 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4762]: I0217 14:07:05.946876 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.048674 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.048721 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.048729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.048746 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.048756 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.070330 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.070361 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:06 crc kubenswrapper[4762]: E0217 14:07:06.070539 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.070567 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:06 crc kubenswrapper[4762]: E0217 14:07:06.070836 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:06 crc kubenswrapper[4762]: E0217 14:07:06.071033 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.072508 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 17:20:26.597351494 +0000 UTC Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.150854 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.150899 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.150911 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.150929 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.150940 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
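Has your network provider started?"}

Once a network provider writes its CNI config, the Ready condition repeated above should transition back to True. A hedged sketch of confirming that from the API instead of tailing the journal, using standard client-go calls; the kubeconfig path is an assumption for this host, and the node name "crc" comes from the entries above:

```go
// nodeready.go: reads the node's Ready condition via client-go instead of
// tailing the kubelet journal. Uses only standard client-go calls; the
// kubeconfig path below is an assumption for a CRC host.
package main

import (
	"context"
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // path assumed
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			// Mirrors the condition JSON in the log: status, reason, message.
			fmt.Printf("Ready=%s reason=%s message=%q\n", c.Status, c.Reason, c.Message)
		}
	}
}
```

While the log above keeps repeating, this prints Ready=False with the same KubeletNotReady reason and CNI message recorded in the setters.go entries.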
Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.253217 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.253264 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.253282 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.253301 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.253313 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.355356 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.355399 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.355407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.355424 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.355433 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.457687 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.457719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.457729 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.457745 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.457756 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.560300 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.560349 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.560359 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.560375 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.560386 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.662615 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.662685 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.662700 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.662717 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.662730 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.765074 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.765184 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.765194 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.765207 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.765217 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.867975 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.868004 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.868013 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.868028 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.868038 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.969844 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.969869 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.969877 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.969889 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4762]: I0217 14:07:06.969897 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.070197 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:07 crc kubenswrapper[4762]: E0217 14:07:07.070299 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.071753 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.071779 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.071790 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.071804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.071816 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.073391 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:02:27.802271697 +0000 UTC Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.173863 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.173897 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.173905 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.173918 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.173926 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.275775 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.275813 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.275822 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.275836 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.275848 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.378232 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.378274 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.378283 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.378299 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.378310 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.481671 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.481713 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.481723 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.481738 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.481747 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.585562 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.585698 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.585730 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.585754 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.585771 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.688328 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.688381 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.688391 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.688407 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.688421 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.790463 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.790505 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.790522 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.790537 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.790548 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.893091 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.893134 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.893144 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.893157 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.893168 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.995711 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.995760 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.995768 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.995782 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4762]: I0217 14:07:07.995794 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.070397 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.070456 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.070414 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:08 crc kubenswrapper[4762]: E0217 14:07:08.070596 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:08 crc kubenswrapper[4762]: E0217 14:07:08.070687 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:08 crc kubenswrapper[4762]: E0217 14:07:08.070803 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.073841 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 19:00:40.203943055 +0000 UTC Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.097718 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.097788 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.097804 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.097826 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.097842 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.201428 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.201475 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.201484 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.201498 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.201511 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.304102 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.304148 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.304156 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.304173 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.304184 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.353629 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.353703 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.353719 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.353740 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.353759 4762 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.412420 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb"] Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.412841 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.415114 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.415131 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.415539 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.415757 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.456486 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4r7p8" podStartSLOduration=88.456460988 podStartE2EDuration="1m28.456460988s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.443895235 +0000 UTC m=+109.023895897" watchObservedRunningTime="2026-02-17 14:07:08.456460988 +0000 UTC m=+109.036461660" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.470831 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.470808099 podStartE2EDuration="1m28.470808099s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.470676756 +0000 UTC m=+109.050677428" watchObservedRunningTime="2026-02-17 14:07:08.470808099 +0000 UTC m=+109.050808751" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.502195 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s25qb" podStartSLOduration=89.502176815 podStartE2EDuration="1m29.502176815s" podCreationTimestamp="2026-02-17 14:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.501806325 +0000 UTC m=+109.081806977" watchObservedRunningTime="2026-02-17 14:07:08.502176815 +0000 UTC m=+109.082177467" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.528467 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.528450772 podStartE2EDuration="54.528450772s" podCreationTimestamp="2026-02-17 14:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.52800904 +0000 UTC m=+109.108009702" watchObservedRunningTime="2026-02-17 14:07:08.528450772 +0000 UTC m=+109.108451424" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.552187 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podStartSLOduration=88.552167988 podStartE2EDuration="1m28.552167988s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.551750577 +0000 UTC m=+109.131751219" watchObservedRunningTime="2026-02-17 14:07:08.552167988 +0000 UTC m=+109.132168640" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.559227 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.559270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9701fb-96f4-4826-a3db-b058b969df02-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.559290 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9701fb-96f4-4826-a3db-b058b969df02-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.559319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff9701fb-96f4-4826-a3db-b058b969df02-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.559336 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.601895 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-76htw" podStartSLOduration=88.601878134 podStartE2EDuration="1m28.601878134s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.601458073 +0000 UTC m=+109.181458725" watchObservedRunningTime="2026-02-17 14:07:08.601878134 +0000 UTC m=+109.181878786" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.610401 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dw82d" podStartSLOduration=88.610385816 podStartE2EDuration="1m28.610385816s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.610015766 +0000 UTC m=+109.190016418" watchObservedRunningTime="2026-02-17 14:07:08.610385816 +0000 UTC m=+109.190386468" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.626730 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xpj6v" podStartSLOduration=88.626716152 podStartE2EDuration="1m28.626716152s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.626155156 +0000 UTC m=+109.206155808" watchObservedRunningTime="2026-02-17 14:07:08.626716152 +0000 UTC m=+109.206716804" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.640163 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.640139058 podStartE2EDuration="1m22.640139058s" podCreationTimestamp="2026-02-17 14:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.63912596 +0000 UTC m=+109.219126612" watchObservedRunningTime="2026-02-17 14:07:08.640139058 +0000 UTC m=+109.220139720" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.648082 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=40.648067533 podStartE2EDuration="40.648067533s" podCreationTimestamp="2026-02-17 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.648058473 +0000 UTC m=+109.228059125" watchObservedRunningTime="2026-02-17 14:07:08.648067533 +0000 UTC m=+109.228068185" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660399 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff9701fb-96f4-4826-a3db-b058b969df02-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660514 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc 
kubenswrapper[4762]: I0217 14:07:08.660541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9701fb-96f4-4826-a3db-b058b969df02-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660569 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9701fb-96f4-4826-a3db-b058b969df02-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.660635 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff9701fb-96f4-4826-a3db-b058b969df02-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.662009 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff9701fb-96f4-4826-a3db-b058b969df02-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.673494 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9701fb-96f4-4826-a3db-b058b969df02-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.674321 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.674303589 podStartE2EDuration="1m28.674303589s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:08.673444445 +0000 UTC m=+109.253445127" watchObservedRunningTime="2026-02-17 14:07:08.674303589 +0000 UTC m=+109.254304241" Feb 17 14:07:08 crc kubenswrapper[4762]: I0217 14:07:08.679170 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9701fb-96f4-4826-a3db-b058b969df02-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vxvhb\" (UID: \"ff9701fb-96f4-4826-a3db-b058b969df02\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:08 crc kubenswrapper[4762]: 
I0217 14:07:08.727824 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.070134 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:09 crc kubenswrapper[4762]: E0217 14:07:09.070806 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.071009 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:07:09 crc kubenswrapper[4762]: E0217 14:07:09.071204 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vksr_openshift-ovn-kubernetes(ab134be0-88ef-45ac-80e0-963a60169ad2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.074240 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:21:56.663399109 +0000 UTC Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.074293 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.082557 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.525022 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" event={"ID":"ff9701fb-96f4-4826-a3db-b058b969df02","Type":"ContainerStarted","Data":"3e967ac66d52a6f257e9202db375864286090620f47ceca3ef9f19b021917366"} Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.525100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" event={"ID":"ff9701fb-96f4-4826-a3db-b058b969df02","Type":"ContainerStarted","Data":"a4fa6f2cbfe3fa4727ed1ac1380fce4712b7e37ae05fd139610813e483c1ad60"} Feb 17 14:07:09 crc kubenswrapper[4762]: I0217 14:07:09.544079 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vxvhb" podStartSLOduration=89.54404971 podStartE2EDuration="1m29.54404971s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:09.54187458 +0000 UTC m=+110.121875242" watchObservedRunningTime="2026-02-17 14:07:09.54404971 +0000 UTC m=+110.124050362" Feb 17 14:07:10 crc kubenswrapper[4762]: I0217 14:07:10.070843 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:10 crc kubenswrapper[4762]: I0217 14:07:10.070843 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:10 crc kubenswrapper[4762]: I0217 14:07:10.071237 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:10 crc kubenswrapper[4762]: E0217 14:07:10.072552 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:10 crc kubenswrapper[4762]: E0217 14:07:10.072724 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:10 crc kubenswrapper[4762]: E0217 14:07:10.072782 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:11 crc kubenswrapper[4762]: I0217 14:07:11.070155 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:11 crc kubenswrapper[4762]: E0217 14:07:11.070339 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:12 crc kubenswrapper[4762]: I0217 14:07:12.071447 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:12 crc kubenswrapper[4762]: E0217 14:07:12.071579 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:12 crc kubenswrapper[4762]: I0217 14:07:12.072433 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:12 crc kubenswrapper[4762]: E0217 14:07:12.072537 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:12 crc kubenswrapper[4762]: I0217 14:07:12.072849 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:12 crc kubenswrapper[4762]: E0217 14:07:12.072962 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:13 crc kubenswrapper[4762]: I0217 14:07:13.070036 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:13 crc kubenswrapper[4762]: E0217 14:07:13.070163 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.070575 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:14 crc kubenswrapper[4762]: E0217 14:07:14.070727 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.070906 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:14 crc kubenswrapper[4762]: E0217 14:07:14.070954 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.071155 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:14 crc kubenswrapper[4762]: E0217 14:07:14.071204 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.541660 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/1.log" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.542504 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/0.log" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.542567 4762 generic.go:334] "Generic (PLEG): container finished" podID="c1057884-d2c5-4911-9b97-fb4fedba9ab1" containerID="97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3" exitCode=1 Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.542596 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerDied","Data":"97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3"} Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.542630 4762 scope.go:117] "RemoveContainer" containerID="1faefd35289d87b8e7efa991c6d44b78d997adf04f682c5b9c3983133124331f" Feb 17 14:07:14 crc kubenswrapper[4762]: I0217 14:07:14.542966 4762 scope.go:117] "RemoveContainer" containerID="97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3" Feb 17 14:07:14 crc kubenswrapper[4762]: E0217 14:07:14.543105 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-4r7p8_openshift-multus(c1057884-d2c5-4911-9b97-fb4fedba9ab1)\"" pod="openshift-multus/multus-4r7p8" podUID="c1057884-d2c5-4911-9b97-fb4fedba9ab1" Feb 17 14:07:15 crc kubenswrapper[4762]: I0217 14:07:15.070596 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:15 crc kubenswrapper[4762]: E0217 14:07:15.070791 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:15 crc kubenswrapper[4762]: I0217 14:07:15.546819 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/1.log" Feb 17 14:07:16 crc kubenswrapper[4762]: I0217 14:07:16.070853 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:16 crc kubenswrapper[4762]: I0217 14:07:16.070879 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:16 crc kubenswrapper[4762]: I0217 14:07:16.070908 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:16 crc kubenswrapper[4762]: E0217 14:07:16.071301 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:16 crc kubenswrapper[4762]: E0217 14:07:16.071423 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:16 crc kubenswrapper[4762]: E0217 14:07:16.071515 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc" Feb 17 14:07:17 crc kubenswrapper[4762]: I0217 14:07:17.070782 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:17 crc kubenswrapper[4762]: E0217 14:07:17.070916 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:18 crc kubenswrapper[4762]: I0217 14:07:18.070134 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf" Feb 17 14:07:18 crc kubenswrapper[4762]: I0217 14:07:18.070205 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:18 crc kubenswrapper[4762]: I0217 14:07:18.070284 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:18 crc kubenswrapper[4762]: E0217 14:07:18.070293 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:18 crc kubenswrapper[4762]: E0217 14:07:18.070427 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:18 crc kubenswrapper[4762]: E0217 14:07:18.070583 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:19 crc kubenswrapper[4762]: I0217 14:07:19.070208 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:19 crc kubenswrapper[4762]: E0217 14:07:19.070317 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:20 crc kubenswrapper[4762]: I0217 14:07:20.070004 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:20 crc kubenswrapper[4762]: I0217 14:07:20.070370 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:20 crc kubenswrapper[4762]: I0217 14:07:20.069991 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:20 crc kubenswrapper[4762]: E0217 14:07:20.071747 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:20 crc kubenswrapper[4762]: E0217 14:07:20.071832 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:20 crc kubenswrapper[4762]: E0217 14:07:20.071924 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:20 crc kubenswrapper[4762]: E0217 14:07:20.116180 4762 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 17 14:07:20 crc kubenswrapper[4762]: E0217 14:07:20.323291 4762 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:07:21 crc kubenswrapper[4762]: I0217 14:07:21.070249 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:21 crc kubenswrapper[4762]: E0217 14:07:21.070602 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:22 crc kubenswrapper[4762]: I0217 14:07:22.070272 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:22 crc kubenswrapper[4762]: I0217 14:07:22.070474 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:22 crc kubenswrapper[4762]: I0217 14:07:22.070545 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:22 crc kubenswrapper[4762]: E0217 14:07:22.070607 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:22 crc kubenswrapper[4762]: E0217 14:07:22.070681 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:22 crc kubenswrapper[4762]: E0217 14:07:22.070713 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.070123 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:23 crc kubenswrapper[4762]: E0217 14:07:23.070457 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.070936 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.573363 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/3.log"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.576172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerStarted","Data":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"}
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.576702 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.975575 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podStartSLOduration=103.975554808 podStartE2EDuration="1m43.975554808s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:23.609619018 +0000 UTC m=+124.189619670" watchObservedRunningTime="2026-02-17 14:07:23.975554808 +0000 UTC m=+124.555555460"
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.976498 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7v8bf"]
Feb 17 14:07:23 crc kubenswrapper[4762]: I0217 14:07:23.976609 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:23 crc kubenswrapper[4762]: E0217 14:07:23.976708 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:24 crc kubenswrapper[4762]: I0217 14:07:24.070378 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:24 crc kubenswrapper[4762]: I0217 14:07:24.070425 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:24 crc kubenswrapper[4762]: E0217 14:07:24.070618 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:24 crc kubenswrapper[4762]: E0217 14:07:24.070816 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:25 crc kubenswrapper[4762]: I0217 14:07:25.070673 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:25 crc kubenswrapper[4762]: E0217 14:07:25.070793 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:25 crc kubenswrapper[4762]: E0217 14:07:25.324568 4762 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:07:26 crc kubenswrapper[4762]: I0217 14:07:26.070391 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:26 crc kubenswrapper[4762]: E0217 14:07:26.070882 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:26 crc kubenswrapper[4762]: I0217 14:07:26.070558 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:26 crc kubenswrapper[4762]: E0217 14:07:26.070973 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:26 crc kubenswrapper[4762]: I0217 14:07:26.070494 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:26 crc kubenswrapper[4762]: E0217 14:07:26.071482 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:27 crc kubenswrapper[4762]: I0217 14:07:27.070018 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:27 crc kubenswrapper[4762]: E0217 14:07:27.070169 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.070365 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.070494 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:28 crc kubenswrapper[4762]: E0217 14:07:28.070775 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.070856 4762 scope.go:117] "RemoveContainer" containerID="97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.070923 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:28 crc kubenswrapper[4762]: E0217 14:07:28.071087 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:28 crc kubenswrapper[4762]: E0217 14:07:28.071419 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.592772 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/1.log"
Feb 17 14:07:28 crc kubenswrapper[4762]: I0217 14:07:28.592891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerStarted","Data":"2180feb9a7871567c44d5f79b87d557e3bcdb1bc5b223e164d5df42091fc7302"}
Feb 17 14:07:29 crc kubenswrapper[4762]: I0217 14:07:29.070222 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:29 crc kubenswrapper[4762]: E0217 14:07:29.070413 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:07:30 crc kubenswrapper[4762]: I0217 14:07:30.070854 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:30 crc kubenswrapper[4762]: I0217 14:07:30.070923 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:30 crc kubenswrapper[4762]: I0217 14:07:30.070866 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:30 crc kubenswrapper[4762]: E0217 14:07:30.073776 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:07:30 crc kubenswrapper[4762]: E0217 14:07:30.073875 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v8bf" podUID="63580a98-4d0e-434e-ad09-e7d542e7a5cc"
Feb 17 14:07:30 crc kubenswrapper[4762]: E0217 14:07:30.073976 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:07:31 crc kubenswrapper[4762]: I0217 14:07:31.070382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:07:31 crc kubenswrapper[4762]: I0217 14:07:31.072805 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 14:07:31 crc kubenswrapper[4762]: I0217 14:07:31.072872 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.070811 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.070827 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.070883 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.073682 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.073692 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.073998 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 14:07:32 crc kubenswrapper[4762]: I0217 14:07:32.075100 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 14:07:34 crc kubenswrapper[4762]: I0217 14:07:34.313190 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.092732 4762 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.123829 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.124269 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.127297 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.127350 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.127547 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.127573 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.129731 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.131070 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.131511 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.132917 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.135959 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.137689 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138141 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138288 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138320 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138614 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138771 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.138962 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.139180 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.139594 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.140697 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rjv84"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.141114 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.142663 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.143057 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8wzgg"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.143347 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8wzgg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.143691 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.144563 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.144896 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.145058 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.146519 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.146533 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.146555 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.146526 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.146800 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.148588 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.149101 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.149864 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.150561 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-54mm8"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.151116 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-54mm8"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.153509 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqmtz"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154176 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154296 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154379 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154488 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154516 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.154627 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.155164 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.155350 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.155386 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.155616 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.162028 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.163004 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.163274 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.163931 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.163946 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.164235 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.165036 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.176549 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.176712 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.176797 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fc6hb"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.177228 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fc6hb"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.177497 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.177616 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.179532 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.179586 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.179784 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180158 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8w48"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180285 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180404 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180470 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180535 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.180413 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181001 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181175 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181329 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181420 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181537 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181669 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.181944 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.182064 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.182175 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.186331 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.186805 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.187174 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.188749 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.188896 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.188994 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189081 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189189 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189312 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189391 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189499 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189589 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189718 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.189888 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190061 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190139 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190193 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190208 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190316 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190415 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190519 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.190072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.192722 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lpmkg"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.193161 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.193410 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.193818 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9878n"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.194328 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.194573 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.196274 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.196351 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-s9l2w"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.196890 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s9l2w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.197179 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.197625 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.197824 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.198604 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.199966 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.198677 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204272 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204330 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204397 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204419 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204448 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48v72\" (UniqueName: \"kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204533 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjr97\" (UniqueName: \"kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204843 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.204839 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.205984 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.205984 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.206101 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.223855 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.227384 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.228246 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.228265 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.228759 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.228940 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.229071 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.229454 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.229452 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.230164 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.230724 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.230948 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231060 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231212 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231382 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231577 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231792 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.231893 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.232025 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.232899 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qh6th"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.264233 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.264512 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.264972 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.265702 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.266540 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.266656 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.267142 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.275741 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.276457 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.276574 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.277009 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.277303 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.277676 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.278032 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.278668 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.284061 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.284278 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.284422 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.294940 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.303715 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"]
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305419 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305457 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whfh\" (UniqueName: \"kubernetes.io/projected/594d6206-b063-4d47-b936-027624c9aa1f-kube-api-access-5whfh\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-encryption-config\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj7x\" (UniqueName: \"kubernetes.io/projected/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-kube-api-access-dcj7x\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305521 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594d6206-b063-4d47-b936-027624c9aa1f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305552 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-metrics-certs\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305567 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-node-pullsecrets\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305603 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-config\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305617 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305662 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsnhw\" (UniqueName: \"kubernetes.io/projected/4c562cce-90d4-4d8e-a172-9b29678930a6-kube-api-access-wsnhw\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvtb\" (UniqueName: \"kubernetes.io/projected/9ea675be-b02f-49aa-a817-c50252ba1aed-kube-api-access-9hvtb\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305712 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305729 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-machine-approver-tls\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305746 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-images\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305781 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96f4e27-3174-43aa-9297-5a7e22094309-serving-cert\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305797 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48v72\" (UniqueName: \"kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305829 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-config\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305846 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b826bc6-e50e-4b2c-8737-254c6d743ad8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305859 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-encryption-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305873 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-dir\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-metrics-tls\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305901 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6xb\" (UniqueName: \"kubernetes.io/projected/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-kube-api-access-9f6xb\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305916 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qxv\" (UniqueName: \"kubernetes.io/projected/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-kube-api-access-b8qxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"
Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305929 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp"
Feb 17 14:07:39 crc 
kubenswrapper[4762]: I0217 14:07:39.305943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-client\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zb2\" (UniqueName: \"kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305973 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4696bf-1ed2-418e-9ff3-478d161d4053-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.305988 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-config\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306023 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-serving-cert\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit-dir\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306054 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-config\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb5f7d28-9379-41a1-8e43-048ce98115f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306091 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306107 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjr97\" (UniqueName: \"kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306138 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306158 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306179 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306200 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c562cce-90d4-4d8e-a172-9b29678930a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306234 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306250 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1378d525-162b-40a3-a2a3-af0dedb9c8b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306266 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-stats-auth\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306280 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhtgn\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-kube-api-access-vhtgn\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306309 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbdt\" (UniqueName: \"kubernetes.io/projected/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-kube-api-access-2jbdt\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6z2h\" (UniqueName: \"kubernetes.io/projected/af4696bf-1ed2-418e-9ff3-478d161d4053-kube-api-access-j6z2h\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-client\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306354 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-image-import-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306369 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-serving-cert\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306382 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306396 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea675be-b02f-49aa-a817-c50252ba1aed-serving-cert\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306411 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b7fbfea-5829-4958-8427-1182a8aba592-trusted-ca\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306425 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-config\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306439 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5wqw\" (UniqueName: \"kubernetes.io/projected/ff39058f-4aad-4477-aa68-0550cd30c2fc-kube-api-access-r5wqw\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd4bq\" (UniqueName: \"kubernetes.io/projected/846c594b-fb0a-4947-bbd4-cf3984892e88-kube-api-access-nd4bq\") pod \"downloads-7954f5f757-fc6hb\" (UID: \"846c594b-fb0a-4947-bbd4-cf3984892e88\") " pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.306553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0b7fbfea-5829-4958-8427-1182a8aba592-metrics-tls\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.307814 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.307874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-service-ca-bundle\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.307901 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1378d525-162b-40a3-a2a3-af0dedb9c8b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.307930 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s48d\" (UniqueName: \"kubernetes.io/projected/3b826bc6-e50e-4b2c-8737-254c6d743ad8-kube-api-access-5s48d\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.307959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594d6206-b063-4d47-b936-027624c9aa1f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.309588 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-service-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.309773 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.309873 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/af4696bf-1ed2-418e-9ff3-478d161d4053-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.309976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310140 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb5f7d28-9379-41a1-8e43-048ce98115f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310236 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-service-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310313 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310384 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-policies\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-client\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310574 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-serving-cert\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310804 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrgdr\" (UniqueName: 
\"kubernetes.io/projected/6612a80c-4172-4e7e-bdff-7845ce18e2c9-kube-api-access-nrgdr\") pod \"migrator-59844c95c7-ngvnd\" (UID: \"6612a80c-4172-4e7e-bdff-7845ce18e2c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310856 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6cm\" (UniqueName: \"kubernetes.io/projected/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-kube-api-access-6m6cm\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310877 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jlb\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-kube-api-access-n8jlb\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310898 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxp7f\" (UniqueName: \"kubernetes.io/projected/f96f4e27-3174-43aa-9297-5a7e22094309-kube-api-access-qxp7f\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310933 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310954 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-647dl\" (UniqueName: \"kubernetes.io/projected/bb5f7d28-9379-41a1-8e43-048ce98115f2-kube-api-access-647dl\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-default-certificate\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.310986 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-auth-proxy-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311040 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311058 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2gb\" (UniqueName: \"kubernetes.io/projected/ee138e67-5a9e-4e1c-a2d0-58223b44451f-kube-api-access-8g2gb\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311082 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311106 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378d525-162b-40a3-a2a3-af0dedb9c8b5-config\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311182 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.311212 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-trusted-ca\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.312790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") 
" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.313423 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.314357 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.314454 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.315044 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.315118 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.315172 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.315376 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.316401 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.316919 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.319332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.323518 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8stcv"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.323740 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.324248 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.324270 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.324354 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.324955 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.325400 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.325545 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.326222 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.338892 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.340126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.349627 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.354594 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.355176 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.359408 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.359457 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rjv84"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.361718 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8wzgg"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.362034 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.368674 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwknl"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.370276 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.371467 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.372473 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9878n"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.373153 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.374184 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.376237 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.377355 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.378838 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8w48"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.379024 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.380547 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.381705 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fc6hb"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.382885 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qh6th"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.385881 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.390338 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.392817 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.396218 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.397161 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqmtz"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.402004 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.405606 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.405787 4762 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.409768 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.411508 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412342 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b21b018-49bb-4c1f-94db-7c8199012455-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-images\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412468 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96f4e27-3174-43aa-9297-5a7e22094309-serving-cert\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-config\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b826bc6-e50e-4b2c-8737-254c6d743ad8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412514 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-encryption-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412530 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9f6xb\" (UniqueName: \"kubernetes.io/projected/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-kube-api-access-9f6xb\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412547 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qxv\" (UniqueName: \"kubernetes.io/projected/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-kube-api-access-b8qxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-client\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zb2\" (UniqueName: \"kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-config\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412675 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-serving-cert\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412746 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit-dir\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: 
\"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412782 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412838 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c562cce-90d4-4d8e-a172-9b29678930a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412880 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412915 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.412942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1378d525-162b-40a3-a2a3-af0dedb9c8b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413001 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " 
pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413027 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7432567-ff75-4020-bb78-eebafaa815c6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413049 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413101 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhtgn\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-kube-api-access-vhtgn\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413125 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413174 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413200 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-stats-auth\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413546 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-images\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6z2h\" (UniqueName: \"kubernetes.io/projected/af4696bf-1ed2-418e-9ff3-478d161d4053-kube-api-access-j6z2h\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc 
kubenswrapper[4762]: I0217 14:07:39.413669 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-client\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-serving-cert\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea675be-b02f-49aa-a817-c50252ba1aed-serving-cert\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413750 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5wqw\" (UniqueName: \"kubernetes.io/projected/ff39058f-4aad-4477-aa68-0550cd30c2fc-kube-api-access-r5wqw\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413772 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd4bq\" (UniqueName: \"kubernetes.io/projected/846c594b-fb0a-4947-bbd4-cf3984892e88-kube-api-access-nd4bq\") pod \"downloads-7954f5f757-fc6hb\" (UID: \"846c594b-fb0a-4947-bbd4-cf3984892e88\") " pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413797 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7fbfea-5829-4958-8427-1182a8aba592-metrics-tls\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413821 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-service-ca-bundle\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413844 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s48d\" (UniqueName: \"kubernetes.io/projected/3b826bc6-e50e-4b2c-8737-254c6d743ad8-kube-api-access-5s48d\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert\") pod \"console-f9d7485db-54mm8\" 
(UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4696bf-1ed2-418e-9ff3-478d161d4053-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413921 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb5f7d28-9379-41a1-8e43-048ce98115f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.413999 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414024 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrgdr\" (UniqueName: \"kubernetes.io/projected/6612a80c-4172-4e7e-bdff-7845ce18e2c9-kube-api-access-nrgdr\") pod \"migrator-59844c95c7-ngvnd\" (UID: \"6612a80c-4172-4e7e-bdff-7845ce18e2c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414053 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6cm\" (UniqueName: \"kubernetes.io/projected/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-kube-api-access-6m6cm\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414076 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jlb\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-kube-api-access-n8jlb\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414106 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxp7f\" (UniqueName: \"kubernetes.io/projected/f96f4e27-3174-43aa-9297-5a7e22094309-kube-api-access-qxp7f\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414157 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2gb\" (UniqueName: \"kubernetes.io/projected/ee138e67-5a9e-4e1c-a2d0-58223b44451f-kube-api-access-8g2gb\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414216 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378d525-162b-40a3-a2a3-af0dedb9c8b5-config\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414274 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-trusted-ca\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414305 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7432567-ff75-4020-bb78-eebafaa815c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414358 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414383 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccg5\" (UniqueName: \"kubernetes.io/projected/6b21b018-49bb-4c1f-94db-7c8199012455-kube-api-access-9ccg5\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414445 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414458 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-config\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414463 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414498 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-metrics-certs\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-config\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414558 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 
14:07:39.414619 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvtb\" (UniqueName: \"kubernetes.io/projected/9ea675be-b02f-49aa-a817-c50252ba1aed-kube-api-access-9hvtb\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-machine-approver-tls\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414704 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414725 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414789 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-dir\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-metrics-tls\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414844 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414863 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7432567-ff75-4020-bb78-eebafaa815c6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414935 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nck5\" (UniqueName: \"kubernetes.io/projected/80457297-b5b8-4fd5-8d38-70958ec21fd1-kube-api-access-7nck5\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414955 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.414994 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4696bf-1ed2-418e-9ff3-478d161d4053-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b21b018-49bb-4c1f-94db-7c8199012455-proxy-tls\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-config\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415361 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b826bc6-e50e-4b2c-8737-254c6d743ad8-config\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415417 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415440 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415439 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb5f7d28-9379-41a1-8e43-048ce98115f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.415518 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.416363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96f4e27-3174-43aa-9297-5a7e22094309-serving-cert\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.416400 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.416420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.416479 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit-dir\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b826bc6-e50e-4b2c-8737-254c6d743ad8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417410 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwtmj\" (UniqueName: \"kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj\") pod \"collect-profiles-29522280-ppgsj\" (UID: 
\"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417448 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbdt\" (UniqueName: \"kubernetes.io/projected/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-kube-api-access-2jbdt\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-image-import-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417560 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b7fbfea-5829-4958-8427-1182a8aba592-trusted-ca\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417602 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25qq\" (UniqueName: \"kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417622 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417626 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-54mm8"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-config\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-trusted-ca\") pod \"console-operator-58897d9998-8wzgg\" (UID: 
\"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417686 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1378d525-162b-40a3-a2a3-af0dedb9c8b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417810 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594d6206-b063-4d47-b936-027624c9aa1f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417862 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-service-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417886 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-service-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417905 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-policies\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417924 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-client\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417941 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-serving-cert\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.417985 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nvl\" (UniqueName: \"kubernetes.io/projected/53121465-80f8-4ed0-bc37-369a780868e1-kube-api-access-m7nvl\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-647dl\" (UniqueName: \"kubernetes.io/projected/bb5f7d28-9379-41a1-8e43-048ce98115f2-kube-api-access-647dl\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418048 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-default-certificate\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418071 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-auth-proxy-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418095 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418118 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418141 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418169 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418214 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whfh\" (UniqueName: \"kubernetes.io/projected/594d6206-b063-4d47-b936-027624c9aa1f-kube-api-access-5whfh\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-encryption-config\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418254 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-client\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418270 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-config\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418265 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcj7x\" (UniqueName: \"kubernetes.io/projected/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-kube-api-access-dcj7x\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418327 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594d6206-b063-4d47-b936-027624c9aa1f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418338 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-dir\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418358 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-node-pullsecrets\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418383 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418418 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418445 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.418474 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsnhw\" (UniqueName: \"kubernetes.io/projected/4c562cce-90d4-4d8e-a172-9b29678930a6-kube-api-access-wsnhw\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.419091 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-encryption-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.419188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee138e67-5a9e-4e1c-a2d0-58223b44451f-serving-cert\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.419226 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.419374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ea675be-b02f-49aa-a817-c50252ba1aed-service-ca-bundle\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.419894 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " 
pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.420029 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594d6206-b063-4d47-b936-027624c9aa1f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.420082 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-node-pullsecrets\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.420194 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee138e67-5a9e-4e1c-a2d0-58223b44451f-etcd-service-ca\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.420332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.420516 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-config\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.421020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-audit-policies\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.421186 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.421557 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-metrics-tls\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.422221 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.422465 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-auth-proxy-config\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.422706 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c562cce-90d4-4d8e-a172-9b29678930a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423007 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bb5f7d28-9379-41a1-8e43-048ce98115f2-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423148 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423225 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-audit\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423288 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423398 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-serving-cert\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423844 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.423975 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96f4e27-3174-43aa-9297-5a7e22094309-config\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.424355 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-image-import-ca\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.425076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-machine-approver-tls\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.425438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.425524 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-serving-cert\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.425563 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-etcd-client\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.426304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-etcd-client\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.426952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.427085 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.427545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.427573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9ea675be-b02f-49aa-a817-c50252ba1aed-serving-cert\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.428079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4696bf-1ed2-418e-9ff3-478d161d4053-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.428442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.428593 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff39058f-4aad-4477-aa68-0550cd30c2fc-encryption-config\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.428408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb5f7d28-9379-41a1-8e43-048ce98115f2-serving-cert\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.429618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4696bf-1ed2-418e-9ff3-478d161d4053-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.429983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-metrics-certs\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.430078 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-stats-auth\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.430167 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-service-ca-bundle\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc 
kubenswrapper[4762]: I0217 14:07:39.430221 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-default-certificate\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.430821 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pl76v"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.431932 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.432100 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zw5wq"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.432541 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.433141 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pl76v"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.434162 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.435151 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lpmkg"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.436329 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.437353 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.438640 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.440014 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.441599 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.442839 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwknl"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.443921 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.444946 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qx5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.446781 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.447200 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.447295 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.447957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8stcv"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.449206 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qx5"] Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.458751 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7fbfea-5829-4958-8427-1182a8aba592-metrics-tls\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.461907 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b7fbfea-5829-4958-8427-1182a8aba592-trusted-ca\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.462368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594d6206-b063-4d47-b936-027624c9aa1f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.466634 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.486943 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.511357 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519735 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7432567-ff75-4020-bb78-eebafaa815c6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519851 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nck5\" (UniqueName: \"kubernetes.io/projected/80457297-b5b8-4fd5-8d38-70958ec21fd1-kube-api-access-7nck5\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519868 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b21b018-49bb-4c1f-94db-7c8199012455-proxy-tls\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519933 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwtmj\" (UniqueName: \"kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519953 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25qq\" (UniqueName: \"kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.519993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520020 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nvl\" (UniqueName: 
\"kubernetes.io/projected/53121465-80f8-4ed0-bc37-369a780868e1-kube-api-access-m7nvl\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520047 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520080 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520105 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b21b018-49bb-4c1f-94db-7c8199012455-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520183 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520200 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520225 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520252 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7432567-ff75-4020-bb78-eebafaa815c6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520265 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520285 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520299 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520339 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520385 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7432567-ff75-4020-bb78-eebafaa815c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520423 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccg5\" (UniqueName: \"kubernetes.io/projected/6b21b018-49bb-4c1f-94db-7c8199012455-kube-api-access-9ccg5\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520459 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.520479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.521805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b21b018-49bb-4c1f-94db-7c8199012455-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.521878 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.524153 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.527362 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.547294 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.566560 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.572296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1378d525-162b-40a3-a2a3-af0dedb9c8b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.586724 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.607045 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.608590 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378d525-162b-40a3-a2a3-af0dedb9c8b5-config\") pod 
\"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.626282 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.631320 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.647394 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.666638 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.673944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-config\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.687006 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.706244 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.717964 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.727350 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.766250 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.788006 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.807476 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.814417 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6b21b018-49bb-4c1f-94db-7c8199012455-proxy-tls\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.826435 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.846140 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.867148 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.886916 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.906554 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.927723 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.933431 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7432567-ff75-4020-bb78-eebafaa815c6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.946573 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.968871 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.975185 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7432567-ff75-4020-bb78-eebafaa815c6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:39 crc kubenswrapper[4762]: I0217 14:07:39.986141 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.006523 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.012549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.027340 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.034250 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.046728 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.067076 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.076497 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.090551 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.096295 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.107106 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.114026 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.127341 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.135175 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.146988 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.155200 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.166454 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.173933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.197469 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.205479 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.205913 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.212480 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.241709 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.246992 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.252280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.266741 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.287834 4762 request.go:700] Waited for 1.006806419s due to client-side throttling, not priority and 
fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dregistry-dockercfg-kzzsd&limit=500&resourceVersion=0 Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.289543 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.306542 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.313519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.326100 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.346424 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.366203 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.388202 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.406174 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.427487 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.447174 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.484297 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48v72\" (UniqueName: \"kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72\") pod \"controller-manager-879f6c89f-58fnv\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") " pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.508329 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.514513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjr97\" (UniqueName: \"kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97\") pod \"route-controller-manager-6576b87f9c-8gksd\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.516211 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.517427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.521617 4762 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.521729 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert podName:53121465-80f8-4ed0-bc37-369a780868e1 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:41.021705635 +0000 UTC m=+141.601706287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert") pod "olm-operator-6b444d44fb-4h4z7" (UID: "53121465-80f8-4ed0-bc37-369a780868e1") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.522087 4762 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.522149 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume podName:3f66bf06-e190-40a2-8503-9e4b5b2f65c6 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:41.022138647 +0000 UTC m=+141.602139299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume") pod "collect-profiles-29522280-ppgsj" (UID: "3f66bf06-e190-40a2-8503-9e4b5b2f65c6") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.522248 4762 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: E0217 14:07:40.522402 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert podName:80457297-b5b8-4fd5-8d38-70958ec21fd1 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:41.022370474 +0000 UTC m=+141.602371126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-5jpk8" (UID: "80457297-b5b8-4fd5-8d38-70958ec21fd1") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.532032 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.546028 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.566264 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.586536 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.614091 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.626796 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.640088 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.646934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.662613 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.666271 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.687910 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.707104 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.727777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.747110 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.766994 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.786183 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.806742 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.821811 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"] Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.826434 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.831229 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"] Feb 17 14:07:40 crc kubenswrapper[4762]: W0217 14:07:40.839857 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ac2af6_e83a_45b3_b0f3_dbbfe7874c40.slice/crio-816a1f341fc58bc9adfc9fdb1598493e84f557a65a05e024b91f1c3b7c746a1d WatchSource:0}: Error finding container 816a1f341fc58bc9adfc9fdb1598493e84f557a65a05e024b91f1c3b7c746a1d: Status 404 returned error can't find the container with id 816a1f341fc58bc9adfc9fdb1598493e84f557a65a05e024b91f1c3b7c746a1d Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.845868 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.866804 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.886271 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.906850 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 14:07:40 crc 
kubenswrapper[4762]: I0217 14:07:40.926828 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.946158 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:07:40 crc kubenswrapper[4762]: I0217 14:07:40.967019 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.007261 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.027571 4762 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.045131 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.045217 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.045965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.046943 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.047844 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.051098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/80457297-b5b8-4fd5-8d38-70958ec21fd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.052553 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53121465-80f8-4ed0-bc37-369a780868e1-srv-cert\") pod 
\"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.082093 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6z2h\" (UniqueName: \"kubernetes.io/projected/af4696bf-1ed2-418e-9ff3-478d161d4053-kube-api-access-j6z2h\") pod \"openshift-apiserver-operator-796bbdcf4f-m4jwv\" (UID: \"af4696bf-1ed2-418e-9ff3-478d161d4053\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.111801 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zb2\" (UniqueName: \"kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2\") pod \"console-f9d7485db-54mm8\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.128790 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.141366 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6xb\" (UniqueName: \"kubernetes.io/projected/f1d3afdb-1d6c-41bb-9203-e2a23a82726e-kube-api-access-9f6xb\") pod \"dns-operator-744455d44c-lpmkg\" (UID: \"f1d3afdb-1d6c-41bb-9203-e2a23a82726e\") " pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.145182 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qxv\" (UniqueName: \"kubernetes.io/projected/47a2ded9-7d7e-48b5-b45c-d4adcebc60c1-kube-api-access-b8qxv\") pod \"control-plane-machine-set-operator-78cbb6b69f-g7x76\" (UID: \"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.159433 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.162733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrgdr\" (UniqueName: \"kubernetes.io/projected/6612a80c-4172-4e7e-bdff-7845ce18e2c9-kube-api-access-nrgdr\") pod \"migrator-59844c95c7-ngvnd\" (UID: \"6612a80c-4172-4e7e-bdff-7845ce18e2c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.186835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6cm\" (UniqueName: \"kubernetes.io/projected/1e5f3005-de4d-4179-ab25-edf5f6b7a6bb-kube-api-access-6m6cm\") pod \"machine-approver-56656f9798-j2kdp\" (UID: \"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.201855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jlb\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-kube-api-access-n8jlb\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: \"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.227339 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxp7f\" (UniqueName: \"kubernetes.io/projected/f96f4e27-3174-43aa-9297-5a7e22094309-kube-api-access-qxp7f\") pod \"console-operator-58897d9998-8wzgg\" (UID: \"f96f4e27-3174-43aa-9297-5a7e22094309\") " pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.245038 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1378d525-162b-40a3-a2a3-af0dedb9c8b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vw9bg\" (UID: \"1378d525-162b-40a3-a2a3-af0dedb9c8b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.258009 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.270971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2gb\" (UniqueName: \"kubernetes.io/projected/ee138e67-5a9e-4e1c-a2d0-58223b44451f-kube-api-access-8g2gb\") pod \"etcd-operator-b45778765-q8w48\" (UID: \"ee138e67-5a9e-4e1c-a2d0-58223b44451f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.282139 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.285248 4762 request.go:700] Waited for 1.866567902s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.288362 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.288780 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcj7x\" (UniqueName: \"kubernetes.io/projected/5ed1b85f-76bf-4fac-ac4e-eeb448205ad5-kube-api-access-dcj7x\") pod \"apiserver-76f77b778f-fqmtz\" (UID: \"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5\") " pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.295938 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.303304 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.303767 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsnhw\" (UniqueName: \"kubernetes.io/projected/4c562cce-90d4-4d8e-a172-9b29678930a6-kube-api-access-wsnhw\") pod \"cluster-samples-operator-665b6dd947-92nvq\" (UID: \"4c562cce-90d4-4d8e-a172-9b29678930a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.322650 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.323549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-647dl\" (UniqueName: \"kubernetes.io/projected/bb5f7d28-9379-41a1-8e43-048ce98115f2-kube-api-access-647dl\") pod \"openshift-config-operator-7777fb866f-9878n\" (UID: \"bb5f7d28-9379-41a1-8e43-048ce98115f2\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.352998 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-54mm8"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.360403 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhtgn\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-kube-api-access-vhtgn\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.364717 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.368575 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b7fbfea-5829-4958-8427-1182a8aba592-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cbkzt\" (UID: \"0b7fbfea-5829-4958-8427-1182a8aba592\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.382040 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvtb\" (UniqueName: \"kubernetes.io/projected/9ea675be-b02f-49aa-a817-c50252ba1aed-kube-api-access-9hvtb\") pod \"authentication-operator-69f744f599-rjv84\" (UID: \"9ea675be-b02f-49aa-a817-c50252ba1aed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.402744 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.406911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ctdpq\" (UID: \"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.422419 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whfh\" (UniqueName: \"kubernetes.io/projected/594d6206-b063-4d47-b936-027624c9aa1f-kube-api-access-5whfh\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2dm4\" (UID: \"594d6206-b063-4d47-b936-027624c9aa1f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:41 crc kubenswrapper[4762]: W0217 14:07:41.427277 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf4696bf_1ed2_418e_9ff3_478d161d4053.slice/crio-ab12f63a30d9d414e2f9118912cb38bf33ef70d1d5acd0c950ca11b574e991eb WatchSource:0}: Error finding container ab12f63a30d9d414e2f9118912cb38bf33ef70d1d5acd0c950ca11b574e991eb: Status 404 returned error can't find the container with id ab12f63a30d9d414e2f9118912cb38bf33ef70d1d5acd0c950ca11b574e991eb Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.445072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbdt\" (UniqueName: \"kubernetes.io/projected/af9aff26-c327-4fe9-ba97-e7ab3f453fa2-kube-api-access-2jbdt\") pod \"router-default-5444994796-s9l2w\" (UID: \"af9aff26-c327-4fe9-ba97-e7ab3f453fa2\") " pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.453175 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lpmkg"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.467977 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4307b8bb-8c42-45ed-a8bc-d08da6bf92e9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4wzcf\" (UID: 
\"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.477346 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.483757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s48d\" (UniqueName: \"kubernetes.io/projected/3b826bc6-e50e-4b2c-8737-254c6d743ad8-kube-api-access-5s48d\") pod \"machine-api-operator-5694c8668f-wpkmz\" (UID: \"3b826bc6-e50e-4b2c-8737-254c6d743ad8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.485625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.492997 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.503023 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.503332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5wqw\" (UniqueName: \"kubernetes.io/projected/ff39058f-4aad-4477-aa68-0550cd30c2fc-kube-api-access-r5wqw\") pod \"apiserver-7bbb656c7d-x9g8w\" (UID: \"ff39058f-4aad-4477-aa68-0550cd30c2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.509166 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" Feb 17 14:07:41 crc kubenswrapper[4762]: W0217 14:07:41.515803 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d3afdb_1d6c_41bb_9203_e2a23a82726e.slice/crio-cd243cc7034baa56dc0a6ae44a650331fc5f66b3c34532baf94a0d5295c6e553 WatchSource:0}: Error finding container cd243cc7034baa56dc0a6ae44a650331fc5f66b3c34532baf94a0d5295c6e553: Status 404 returned error can't find the container with id cd243cc7034baa56dc0a6ae44a650331fc5f66b3c34532baf94a0d5295c6e553 Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.525388 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd4bq\" (UniqueName: \"kubernetes.io/projected/846c594b-fb0a-4947-bbd4-cf3984892e88-kube-api-access-nd4bq\") pod \"downloads-7954f5f757-fc6hb\" (UID: \"846c594b-fb0a-4947-bbd4-cf3984892e88\") " pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.526769 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.547916 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.554339 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.568230 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.568713 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.574432 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.588894 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.589155 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.591103 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.603797 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.607081 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.629011 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.639475 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" event={"ID":"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb","Type":"ContainerStarted","Data":"960cfc4716bca10e17ec0a0ad43993866ed6b385e90e44c290e2e81eaf4e5ed7"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.641724 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" event={"ID":"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1","Type":"ContainerStarted","Data":"5328a59f6ddf5e4c18e9b2c53c6d8cd021a6b847f83ef93ebfc3324f8d9ec05d"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.645815 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" event={"ID":"1378d525-162b-40a3-a2a3-af0dedb9c8b5","Type":"ContainerStarted","Data":"11771548a2a29c78469415151e5b4e9e54275d16a33284401d930f59c50e7a22"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.646879 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.647605 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" event={"ID":"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40","Type":"ContainerStarted","Data":"8fde1cc2cbe99f8191e2b326908699fbb48ef74fea2039f786b1dc33059b8407"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.647623 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" event={"ID":"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40","Type":"ContainerStarted","Data":"816a1f341fc58bc9adfc9fdb1598493e84f557a65a05e024b91f1c3b7c746a1d"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.647858 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.650165 4762 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-58fnv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.650237 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.655446 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-54mm8" 
event={"ID":"151149d5-152a-49f8-8c5f-453e68dc4bf5","Type":"ContainerStarted","Data":"c6aad3bb942412eed53be77e9ea8cd21deecfc1a2f77ab31f6dd3298a48fe5a7"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.657492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" event={"ID":"af4696bf-1ed2-418e-9ff3-478d161d4053","Type":"ContainerStarted","Data":"ab12f63a30d9d414e2f9118912cb38bf33ef70d1d5acd0c950ca11b574e991eb"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.666634 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" event={"ID":"a57a8269-657e-49f2-8edb-189e9f69f1b4","Type":"ContainerStarted","Data":"d0be7f9a275847575913aafbe2fd9d7e9bfed6f9d3f92e11d83afdf2556453c3"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.666959 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" event={"ID":"a57a8269-657e-49f2-8edb-189e9f69f1b4","Type":"ContainerStarted","Data":"25576de0dbc476e17785eb2deb3ed267114711ee7feca36b6ab70372d4a42c6f"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.666754 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.667214 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.673331 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" event={"ID":"f1d3afdb-1d6c-41bb-9203-e2a23a82726e","Type":"ContainerStarted","Data":"cd243cc7034baa56dc0a6ae44a650331fc5f66b3c34532baf94a0d5295c6e553"} Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.675423 4762 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8gksd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.675465 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.675558 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.680011 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8wzgg"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.690966 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 14:07:41 crc kubenswrapper[4762]: W0217 14:07:41.699654 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96f4e27_3174_43aa_9297_5a7e22094309.slice/crio-aeb5a1a23388cb1e3385c16bb1e87a87c5a869a5b8540d80ac879196aa27d429 WatchSource:0}: Error finding container aeb5a1a23388cb1e3385c16bb1e87a87c5a869a5b8540d80ac879196aa27d429: Status 404 returned error can't find the container with id aeb5a1a23388cb1e3385c16bb1e87a87c5a869a5b8540d80ac879196aa27d429 Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.708446 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.748157 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9878n"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.749621 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.753961 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7432567-ff75-4020-bb78-eebafaa815c6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lqnsz\" (UID: \"f7432567-ff75-4020-bb78-eebafaa815c6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.763614 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nck5\" (UniqueName: \"kubernetes.io/projected/80457297-b5b8-4fd5-8d38-70958ec21fd1-kube-api-access-7nck5\") pod \"package-server-manager-789f6589d5-5jpk8\" (UID: \"80457297-b5b8-4fd5-8d38-70958ec21fd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.773535 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.790789 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwtmj\" (UniqueName: \"kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj\") pod \"collect-profiles-29522280-ppgsj\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.792583 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqmtz"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.795420 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.812250 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25qq\" (UniqueName: \"kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq\") pod \"oauth-openshift-558db77b4-phpw5\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.826191 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nvl\" (UniqueName: \"kubernetes.io/projected/53121465-80f8-4ed0-bc37-369a780868e1-kube-api-access-m7nvl\") pod \"olm-operator-6b444d44fb-4h4z7\" (UID: \"53121465-80f8-4ed0-bc37-369a780868e1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.853999 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccg5\" (UniqueName: \"kubernetes.io/projected/6b21b018-49bb-4c1f-94db-7c8199012455-kube-api-access-9ccg5\") pod \"machine-config-controller-84d6567774-nm2sc\" (UID: \"6b21b018-49bb-4c1f-94db-7c8199012455\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.860694 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8w48"] Feb 17 14:07:41 crc kubenswrapper[4762]: W0217 14:07:41.910192 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5f7d28_9379_41a1_8e43_048ce98115f2.slice/crio-d53436c5469cef4da49140ac9d4bea6b7663f75f51efda2a2a9dd04df42444ea WatchSource:0}: Error finding container d53436c5469cef4da49140ac9d4bea6b7663f75f51efda2a2a9dd04df42444ea: Status 404 returned error can't find the container with id d53436c5469cef4da49140ac9d4bea6b7663f75f51efda2a2a9dd04df42444ea Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.916376 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.923960 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq"] Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.933910 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.943410 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.959878 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-images\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960324 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960353 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-srv-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960377 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/f7002df3-a8f7-4a82-8268-f4f5112c94be-kube-api-access-bqtdr\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960433 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2cdcff-72d6-4c93-9157-591b007be2a3-config\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960459 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960490 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbplx\" (UniqueName: \"kubernetes.io/projected/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-kube-api-access-bbplx\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: 
\"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsl57\" (UniqueName: \"kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960670 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-key\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-profile-collector-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960797 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghdmb\" (UniqueName: \"kubernetes.io/projected/3a2cdcff-72d6-4c93-9157-591b007be2a3-kube-api-access-ghdmb\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960823 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5kg5\" (UniqueName: \"kubernetes.io/projected/aac37181-0c34-4fae-b735-d1530b599541-kube-api-access-q5kg5\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960870 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2cdcff-72d6-4c93-9157-591b007be2a3-serving-cert\") pod 
\"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.960930 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsrk\" (UniqueName: \"kubernetes.io/projected/04f375f1-7bd2-4b95-b812-9e114d4e7963-kube-api-access-tjsrk\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.961015 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-apiservice-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.961084 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977418 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9nl\" (UniqueName: \"kubernetes.io/projected/02c7ad77-d801-4f6b-92a9-470b4460d698-kube-api-access-9f9nl\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977778 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79k9\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977837 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-cabundle\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.977949 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0d6e0aaf-bec2-4091-a434-58d6cf2be048-tmpfs\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978071 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02c7ad77-d801-4f6b-92a9-470b4460d698-proxy-tls\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978574 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978617 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac37181-0c34-4fae-b735-d1530b599541-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978729 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978799 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-webhook-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.978865 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq579\" (UniqueName: \"kubernetes.io/projected/0d6e0aaf-bec2-4091-a434-58d6cf2be048-kube-api-access-tq579\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.996678 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:41 crc kubenswrapper[4762]: E0217 14:07:41.999406 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.499390884 +0000 UTC m=+143.079391536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:41 crc kubenswrapper[4762]: I0217 14:07:41.999902 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.017697 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.042585 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wpkmz"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.081368 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.084907 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.584883149 +0000 UTC m=+143.164883801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.085289 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq579\" (UniqueName: \"kubernetes.io/projected/0d6e0aaf-bec2-4091-a434-58d6cf2be048-kube-api-access-tq579\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.085319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-socket-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.086335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-registration-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.086605 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12731d21-fa65-4ff9-820e-f961da223378-config-volume\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.086676 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d588f\" (UniqueName: \"kubernetes.io/projected/12731d21-fa65-4ff9-820e-f961da223378-kube-api-access-d588f\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.089904 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-node-bootstrap-token\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.089946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhlj\" (UniqueName: \"kubernetes.io/projected/12de56fb-5540-495c-b841-5093b7bfb534-kube-api-access-klhlj\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.089972 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-images\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.089992 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-srv-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090059 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/f7002df3-a8f7-4a82-8268-f4f5112c94be-kube-api-access-bqtdr\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2cdcff-72d6-4c93-9157-591b007be2a3-config\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090135 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090151 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbplx\" (UniqueName: \"kubernetes.io/projected/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-kube-api-access-bbplx\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090220 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsl57\" (UniqueName: \"kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57\") pod 
\"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090239 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gzpx\" (UniqueName: \"kubernetes.io/projected/931f2aa0-da21-494b-abe7-9f8b843df3ca-kube-api-access-2gzpx\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090263 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-key\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090280 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-profile-collector-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090354 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghdmb\" (UniqueName: \"kubernetes.io/projected/3a2cdcff-72d6-4c93-9157-591b007be2a3-kube-api-access-ghdmb\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5kg5\" (UniqueName: \"kubernetes.io/projected/aac37181-0c34-4fae-b735-d1530b599541-kube-api-access-q5kg5\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090498 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-mountpoint-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090520 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090557 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2cdcff-72d6-4c93-9157-591b007be2a3-serving-cert\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090572 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsrk\" (UniqueName: \"kubernetes.io/projected/04f375f1-7bd2-4b95-b812-9e114d4e7963-kube-api-access-tjsrk\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090611 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-apiservice-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090626 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-certs\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f9nl\" (UniqueName: \"kubernetes.io/projected/02c7ad77-d801-4f6b-92a9-470b4460d698-kube-api-access-9f9nl\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090737 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-csi-data-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090757 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-plugins-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090783 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79k9\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-cabundle\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090867 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0d6e0aaf-bec2-4091-a434-58d6cf2be048-tmpfs\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090925 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090952 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02c7ad77-d801-4f6b-92a9-470b4460d698-proxy-tls\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090969 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-cert\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.090984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12731d21-fa65-4ff9-820e-f961da223378-metrics-tls\") pod \"dns-default-pl76v\" (UID: 
\"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091056 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdw6\" (UniqueName: \"kubernetes.io/projected/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-kube-api-access-9fdw6\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091091 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac37181-0c34-4fae-b735-d1530b599541-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091156 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091175 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.091212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-webhook-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.094779 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.096471 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02c7ad77-d801-4f6b-92a9-470b4460d698-images\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 
14:07:42.100876 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.101275 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.601262594 +0000 UTC m=+143.181263246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.101952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.102441 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.103601 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2cdcff-72d6-4c93-9157-591b007be2a3-config\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.104273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0d6e0aaf-bec2-4091-a434-58d6cf2be048-tmpfs\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.104280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-cabundle\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.107063 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.107829 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02c7ad77-d801-4f6b-92a9-470b4460d698-proxy-tls\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.108002 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-webhook-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.108325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.119041 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac37181-0c34-4fae-b735-d1530b599541-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.121579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04f375f1-7bd2-4b95-b812-9e114d4e7963-signing-key\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.122583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d6e0aaf-bec2-4091-a434-58d6cf2be048-apiservice-cert\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.124142 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-profile-collector-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.127325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.128009 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.129017 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.135514 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.142039 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq579\" (UniqueName: \"kubernetes.io/projected/0d6e0aaf-bec2-4091-a434-58d6cf2be048-kube-api-access-tq579\") pod \"packageserver-d55dfcdfc-g6gf6\" (UID: \"0d6e0aaf-bec2-4091-a434-58d6cf2be048\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.144100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2cdcff-72d6-4c93-9157-591b007be2a3-serving-cert\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.146618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7002df3-a8f7-4a82-8268-f4f5112c94be-srv-cert\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.172121 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.188449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbplx\" (UniqueName: \"kubernetes.io/projected/8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5-kube-api-access-bbplx\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvbqb\" (UID: \"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192102 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192242 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-csi-data-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192261 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-plugins-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192297 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-cert\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192312 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12731d21-fa65-4ff9-820e-f961da223378-metrics-tls\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192333 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdw6\" (UniqueName: \"kubernetes.io/projected/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-kube-api-access-9fdw6\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192362 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-socket-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192385 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-registration-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12731d21-fa65-4ff9-820e-f961da223378-config-volume\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d588f\" (UniqueName: \"kubernetes.io/projected/12731d21-fa65-4ff9-820e-f961da223378-kube-api-access-d588f\") pod \"dns-default-pl76v\" (UID: 
\"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192433 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-node-bootstrap-token\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192447 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhlj\" (UniqueName: \"kubernetes.io/projected/12de56fb-5540-495c-b841-5093b7bfb534-kube-api-access-klhlj\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gzpx\" (UniqueName: \"kubernetes.io/projected/931f2aa0-da21-494b-abe7-9f8b843df3ca-kube-api-access-2gzpx\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192545 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-mountpoint-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.192577 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-certs\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.193218 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-socket-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.193891 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.693868647 +0000 UTC m=+143.273869319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.194629 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-plugins-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.194692 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-csi-data-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.194923 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-mountpoint-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.195475 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12731d21-fa65-4ff9-820e-f961da223378-config-volume\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.195563 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12de56fb-5540-495c-b841-5093b7bfb534-registration-dir\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.196919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-cert\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.197377 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-certs\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.198075 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/931f2aa0-da21-494b-abe7-9f8b843df3ca-node-bootstrap-token\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.199787 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12731d21-fa65-4ff9-820e-f961da223378-metrics-tls\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.211481 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsl57\" (UniqueName: \"kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57\") pod \"marketplace-operator-79b997595-xxdg7\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.229580 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.231290 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5kg5\" (UniqueName: \"kubernetes.io/projected/aac37181-0c34-4fae-b735-d1530b599541-kube-api-access-q5kg5\") pod \"multus-admission-controller-857f4d67dd-qh6th\" (UID: \"aac37181-0c34-4fae-b735-d1530b599541\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.260705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/f7002df3-a8f7-4a82-8268-f4f5112c94be-kube-api-access-bqtdr\") pod \"catalog-operator-68c6474976-b95q5\" (UID: \"f7002df3-a8f7-4a82-8268-f4f5112c94be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.266408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsrk\" (UniqueName: \"kubernetes.io/projected/04f375f1-7bd2-4b95-b812-9e114d4e7963-kube-api-access-tjsrk\") pod \"service-ca-9c57cc56f-8stcv\" (UID: \"04f375f1-7bd2-4b95-b812-9e114d4e7963\") " pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.273059 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.279727 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.283593 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79k9\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.293788 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.294163 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.794148872 +0000 UTC m=+143.374149524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.299684 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.312513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghdmb\" (UniqueName: \"kubernetes.io/projected/3a2cdcff-72d6-4c93-9157-591b007be2a3-kube-api-access-ghdmb\") pod \"service-ca-operator-777779d784-gjmh5\" (UID: \"3a2cdcff-72d6-4c93-9157-591b007be2a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.329377 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.334273 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.338547 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fc6hb"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.350846 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9nl\" (UniqueName: \"kubernetes.io/projected/02c7ad77-d801-4f6b-92a9-470b4460d698-kube-api-access-9f9nl\") pod \"machine-config-operator-74547568cd-tn8f4\" (UID: \"02c7ad77-d801-4f6b-92a9-470b4460d698\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.353734 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.367692 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gzpx\" (UniqueName: \"kubernetes.io/projected/931f2aa0-da21-494b-abe7-9f8b843df3ca-kube-api-access-2gzpx\") pod \"machine-config-server-zw5wq\" (UID: \"931f2aa0-da21-494b-abe7-9f8b843df3ca\") " pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.375388 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zw5wq" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.389606 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.392013 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.395182 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.395512 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.895497258 +0000 UTC m=+143.475497910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.420390 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhlj\" (UniqueName: \"kubernetes.io/projected/12de56fb-5540-495c-b841-5093b7bfb534-kube-api-access-klhlj\") pod \"csi-hostpathplugin-mwknl\" (UID: \"12de56fb-5540-495c-b841-5093b7bfb534\") " pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.435928 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdw6\" (UniqueName: \"kubernetes.io/projected/1c8da7c3-3aaf-4256-9183-8f60b7131e6e-kube-api-access-9fdw6\") pod \"ingress-canary-q9qx5\" (UID: \"1c8da7c3-3aaf-4256-9183-8f60b7131e6e\") " pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.449332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d588f\" (UniqueName: \"kubernetes.io/projected/12731d21-fa65-4ff9-820e-f961da223378-kube-api-access-d588f\") pod \"dns-default-pl76v\" (UID: \"12731d21-fa65-4ff9-820e-f961da223378\") " pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.498599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.499239 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:42.999225359 +0000 UTC m=+143.579226021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.511203 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.563788 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.599838 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.599987 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.099956298 +0000 UTC m=+143.679956950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.600105 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.600489 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.100478652 +0000 UTC m=+143.680479304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.610229 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.659673 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.668437 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.677244 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q9qx5" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.705383 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.707142 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.207120975 +0000 UTC m=+143.787121627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.730864 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" event={"ID":"af4696bf-1ed2-418e-9ff3-478d161d4053","Type":"ContainerStarted","Data":"c1a8fb1627574ffcef0055ef9cee87e50deabd610c62c26d0699ed9681511517"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.755830 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" event={"ID":"47a2ded9-7d7e-48b5-b45c-d4adcebc60c1","Type":"ContainerStarted","Data":"139945097f557d4bbb82847283f7fe665d1ae680516ab587e693569f37f395ed"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.777902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" event={"ID":"4c562cce-90d4-4d8e-a172-9b29678930a6","Type":"ContainerStarted","Data":"a892622c7f9cccdcb9eec546ae1d765497466d02b3081878fc1c55610b396c0f"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.792055 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.799422 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.808468 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.809155 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:07:43.309141009 +0000 UTC m=+143.889141661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.810188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" event={"ID":"bb5f7d28-9379-41a1-8e43-048ce98115f2","Type":"ContainerStarted","Data":"d53436c5469cef4da49140ac9d4bea6b7663f75f51efda2a2a9dd04df42444ea"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.826304 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" event={"ID":"f1d3afdb-1d6c-41bb-9203-e2a23a82726e","Type":"ContainerStarted","Data":"4d05bb6ef20a9e6afbbf7af248cfde10dbaef8d95804f497b20532da43c6ce70"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.826755 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rjv84"] Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.833522 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" event={"ID":"ff39058f-4aad-4477-aa68-0550cd30c2fc","Type":"ContainerStarted","Data":"7339d57ccaef9a877703b73e776769fca2960f1a4210ab87122668a6b0d62334"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.839725 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" event={"ID":"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb","Type":"ContainerStarted","Data":"3db1c68ec05f174789ce0d83f00d032cd4d83639f7bb9bceebfa9605cc97b431"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.850041 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" event={"ID":"3b826bc6-e50e-4b2c-8737-254c6d743ad8","Type":"ContainerStarted","Data":"34e611a665f8db203f95e266db93627d3035cc0b9d8b932fe00bcb2162bf15bf"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.874449 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" event={"ID":"6612a80c-4172-4e7e-bdff-7845ce18e2c9","Type":"ContainerStarted","Data":"3a5303f02b2ac15ff6aa555675f2b599b9640e720081c5f3be6ecc0903ebf2cf"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.874506 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" event={"ID":"6612a80c-4172-4e7e-bdff-7845ce18e2c9","Type":"ContainerStarted","Data":"adf37727c1d792e524f0c225621ca68bdb1bc0b24e7525508cb471ebf955b71d"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.882693 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" podStartSLOduration=121.882670581 podStartE2EDuration="2m1.882670581s" podCreationTimestamp="2026-02-17 14:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:42.881286453 +0000 UTC m=+143.461287125" watchObservedRunningTime="2026-02-17 14:07:42.882670581 +0000 UTC m=+143.462671233" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.889793 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s9l2w" event={"ID":"af9aff26-c327-4fe9-ba97-e7ab3f453fa2","Type":"ContainerStarted","Data":"e36e09681f3d8068d352b0ba4aa8f03c1c0b23635080e81a29c802873f64e2eb"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.889839 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s9l2w" event={"ID":"af9aff26-c327-4fe9-ba97-e7ab3f453fa2","Type":"ContainerStarted","Data":"50621007686e36e5242f46307c9adf9575345b6b4051697818d8596ea723549b"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.896761 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" event={"ID":"ee138e67-5a9e-4e1c-a2d0-58223b44451f","Type":"ContainerStarted","Data":"96416cca2dd76e41ad7117e80922962a7f8c0a920016c417e82970af69e25a3c"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.902058 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" event={"ID":"1378d525-162b-40a3-a2a3-af0dedb9c8b5","Type":"ContainerStarted","Data":"069d67f1fd0fb79c719032d96d9ba9c11e9d2f531d71e97291295d22214d9571"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.903880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" event={"ID":"f96f4e27-3174-43aa-9297-5a7e22094309","Type":"ContainerStarted","Data":"0767e0f3e2e0ba452ced130f7e8baa5b815ded72959973ce50146fe945db91a3"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.903908 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" event={"ID":"f96f4e27-3174-43aa-9297-5a7e22094309","Type":"ContainerStarted","Data":"aeb5a1a23388cb1e3385c16bb1e87a87c5a869a5b8540d80ac879196aa27d429"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.904660 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.908520 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" event={"ID":"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff","Type":"ContainerStarted","Data":"d8ef4843dd7555b582c56034d02471ad7104b4ee2f486c3a94eda76a120a68d1"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.909370 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:42 crc kubenswrapper[4762]: E0217 14:07:42.910229 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:07:43.410215127 +0000 UTC m=+143.990215779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.923981 4762 patch_prober.go:28] interesting pod/console-operator-58897d9998-8wzgg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.924035 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" podUID="f96f4e27-3174-43aa-9297-5a7e22094309" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.944090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" event={"ID":"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5","Type":"ContainerStarted","Data":"27258bad7005f2629a7f8880829e1092153012d2ba6340cef3804e47341dbf54"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.957919 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-54mm8" event={"ID":"151149d5-152a-49f8-8c5f-453e68dc4bf5","Type":"ContainerStarted","Data":"9e696a6f7238329a5d4bccd348be6fc2d7bbdeadbcbf8c2bac2f016c90c416e1"} Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.962441 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" Feb 17 14:07:42 crc kubenswrapper[4762]: I0217 14:07:42.971524 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.010465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.013267 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.513254489 +0000 UTC m=+144.093255141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.112362 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.112490 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.612462415 +0000 UTC m=+144.192463067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.112819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.113093 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.613080602 +0000 UTC m=+144.193081254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.174139 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.197192 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.217495 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.218640 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.219242 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.71918275 +0000 UTC m=+144.299183402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.320748 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.321322 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.821310487 +0000 UTC m=+144.401311139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.429463 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.429807 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:43.92979223 +0000 UTC m=+144.509792882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.471241 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.534519 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.535128 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.035101436 +0000 UTC m=+144.615102088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.569714 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.592808 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.593124 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.636834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.637158 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.13714235 +0000 UTC m=+144.717143002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.739117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.739480 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.239469133 +0000 UTC m=+144.819469785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.761715 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" podStartSLOduration=123.761698331 podStartE2EDuration="2m3.761698331s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:43.72352183 +0000 UTC m=+144.303522482" watchObservedRunningTime="2026-02-17 14:07:43.761698331 +0000 UTC m=+144.341698983" Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.843114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.843747 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.343731539 +0000 UTC m=+144.923732191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.860108 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.871313 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5"] Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.944379 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:43 crc kubenswrapper[4762]: E0217 14:07:43.944910 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.44489525 +0000 UTC m=+145.024895902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.973432 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4jwv" podStartSLOduration=123.973417182 podStartE2EDuration="2m3.973417182s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:43.962502039 +0000 UTC m=+144.542502691" watchObservedRunningTime="2026-02-17 14:07:43.973417182 +0000 UTC m=+144.553417834" Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.985323 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" event={"ID":"f7432567-ff75-4020-bb78-eebafaa815c6","Type":"ContainerStarted","Data":"bd75966502ec9ce8ee6b5f0703b77069ae5e17a1b0e0601c44bacb0ca2729174"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.986213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fc6hb" event={"ID":"846c594b-fb0a-4947-bbd4-cf3984892e88","Type":"ContainerStarted","Data":"f370ead10ce5c4b0b90330b0f15519fe6cbfccae6a24077ffc956942db7c2d85"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.986962 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zw5wq" event={"ID":"931f2aa0-da21-494b-abe7-9f8b843df3ca","Type":"ContainerStarted","Data":"174ece2f2a5517dd884a90deb1c82e89614bb498a3808f9793bff9002d69fff7"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.987617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" event={"ID":"6b21b018-49bb-4c1f-94db-7c8199012455","Type":"ContainerStarted","Data":"5fbb96dc53f53240c451499c3204bb2f2ee6a11926f3e47b53bc115f56064fef"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.991015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" event={"ID":"1e5f3005-de4d-4179-ab25-edf5f6b7a6bb","Type":"ContainerStarted","Data":"59b8cf84ec983a999266ef55a9ee87891aa7d9c841221fd41d917e69b102fee5"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.994351 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" event={"ID":"80457297-b5b8-4fd5-8d38-70958ec21fd1","Type":"ContainerStarted","Data":"8def0cb11b66988f0eec8e7918662cd207c756031f5f537f07b92de0c2ebc053"} Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.996210 4762 generic.go:334] "Generic (PLEG): container finished" podID="bb5f7d28-9379-41a1-8e43-048ce98115f2" containerID="ffcc3d91769e0364d924d01ffb170c25e4609dd8acfcd83c6448f47d0f35dde5" exitCode=0 Feb 17 14:07:43 crc kubenswrapper[4762]: I0217 14:07:43.996540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" event={"ID":"bb5f7d28-9379-41a1-8e43-048ce98115f2","Type":"ContainerDied","Data":"ffcc3d91769e0364d924d01ffb170c25e4609dd8acfcd83c6448f47d0f35dde5"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.002368 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-s9l2w" podStartSLOduration=124.002350616 podStartE2EDuration="2m4.002350616s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.001809371 +0000 UTC m=+144.581810033" watchObservedRunningTime="2026-02-17 14:07:44.002350616 +0000 UTC m=+144.582351268" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.005210 4762 generic.go:334] "Generic (PLEG): container finished" podID="5ed1b85f-76bf-4fac-ac4e-eeb448205ad5" containerID="7ef8fd908f07c37238fba697ca7aabfe00c2c443f2e710df6ef6d5abeb1ddacc" exitCode=0 Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.005268 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" event={"ID":"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5","Type":"ContainerDied","Data":"7ef8fd908f07c37238fba697ca7aabfe00c2c443f2e710df6ef6d5abeb1ddacc"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.010334 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" event={"ID":"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9","Type":"ContainerStarted","Data":"1042bdf707739e7b0e1e3672c10bdf0560a6b816bbbe0fbedb6a1ddf08b9d0d5"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.011960 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" event={"ID":"0b7fbfea-5829-4958-8427-1182a8aba592","Type":"ContainerStarted","Data":"5f0cd7daf1682dbbbc738def7081a64ade96a658ca4640ea15aacfe5fc589ebf"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.014100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" event={"ID":"f1d3afdb-1d6c-41bb-9203-e2a23a82726e","Type":"ContainerStarted","Data":"e9e528c855049e0daeaee7fd70cbc7364db4bece7f326fdd6b8b579bfafbb4ed"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.014851 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:44 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:44 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:44 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.014890 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.046010 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.046695 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.546679257 +0000 UTC m=+145.126679909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.048895 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" event={"ID":"594d6206-b063-4d47-b936-027624c9aa1f","Type":"ContainerStarted","Data":"737c5a76ba127a52b92e7151ef83b23b105d379e3a32bb5287ec330785204e6a"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.048935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" event={"ID":"594d6206-b063-4d47-b936-027624c9aa1f","Type":"ContainerStarted","Data":"7a1d2b4f36cfb8fc69fca07286a1ee2776c1442281a96392352b005d971e82a3"} Feb 17 14:07:44 crc kubenswrapper[4762]: W0217 14:07:44.049312 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d6e0aaf_bec2_4091_a434_58d6cf2be048.slice/crio-acb325ce4d3e741f9143c7530ae32f17acff887c3336149e56de45bc353f6f24 WatchSource:0}: Error finding container acb325ce4d3e741f9143c7530ae32f17acff887c3336149e56de45bc353f6f24: Status 404 returned error can't find the container with id acb325ce4d3e741f9143c7530ae32f17acff887c3336149e56de45bc353f6f24 Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.050600 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vw9bg" podStartSLOduration=124.050591516 podStartE2EDuration="2m4.050591516s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.050199395 +0000 UTC m=+144.630200057" watchObservedRunningTime="2026-02-17 14:07:44.050591516 +0000 UTC m=+144.630592168" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.055966 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" event={"ID":"53121465-80f8-4ed0-bc37-369a780868e1","Type":"ContainerStarted","Data":"f944fae3186c1050b0e3ab17c3c9cba938796ba91aad98d6a7c028420c97249d"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.063448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" event={"ID":"ee138e67-5a9e-4e1c-a2d0-58223b44451f","Type":"ContainerStarted","Data":"e64e7af64992647bdb01457d62563397727bb4a60e3a3403ee9eac52794669ed"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 
14:07:44.090682 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" event={"ID":"6612a80c-4172-4e7e-bdff-7845ce18e2c9","Type":"ContainerStarted","Data":"e305e0a5d5082390a70ab8e0b13bf41160ee53b4fcb3735e0c46b97a93d9827e"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.102950 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" event={"ID":"9ea675be-b02f-49aa-a817-c50252ba1aed","Type":"ContainerStarted","Data":"77728e568e6a5b88d4fb0c74a5e3858e7e42c4fd295e6e1c33a51106aeacccfa"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.103890 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" event={"ID":"3b826bc6-e50e-4b2c-8737-254c6d743ad8","Type":"ContainerStarted","Data":"3a7d15866c8aebd48a373a9c0e65676dcd717a07d013f9283f1c5bfa18ed555e"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.105961 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" event={"ID":"4c562cce-90d4-4d8e-a172-9b29678930a6","Type":"ContainerStarted","Data":"5a98d23222fb20be8f8a15f60dde8181b91a56eb4dfdc6f3b8a3a4b3fe1d8077"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.106695 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" event={"ID":"02adf3f5-bd74-409a-8942-f77cba830901","Type":"ContainerStarted","Data":"439f97fd81cf77e412e0dacf2e7be27738b5a58642ae8b87fd6a21ae4ba02ba1"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.126145 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerStarted","Data":"425ec11b65afba8e7bc2b7b9c11829e3a3d45eb87429259d90d806e5f2f8eeef"} Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.147637 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g7x76" podStartSLOduration=124.147621571 podStartE2EDuration="2m4.147621571s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.102245951 +0000 UTC m=+144.682246603" watchObservedRunningTime="2026-02-17 14:07:44.147621571 +0000 UTC m=+144.727622223" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.160122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.163781 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.66376285 +0000 UTC m=+145.243763602 (durationBeforeRetry 500ms). 
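
The pod_startup_latency_tracker lines interleaved here are bookkeeping, not errors. With firstStartedPulling and lastFinishedPulling at the zero time (the images were already on disk), the reported SLO duration is simply the gap between pod creation and the time the pod was observed running. Checking that arithmetic against the migrator pod's own timestamps:

```go
// Recomputing podStartSLOduration from the timestamps in the log line
// for migrator-59844c95c7-ngvnd; with no image pulls to subtract,
// SLO duration equals the end-to-end duration.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's default time.Time format
	created, err := time.Parse(layout, "2026-02-17 14:05:40 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2026-02-17 14:07:44.194805832 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(observed.Sub(created)) // 2m4.194805832s, i.e. podStartSLOduration≈124.19s
}
```
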
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.194831 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ngvnd" podStartSLOduration=124.194805832 podStartE2EDuration="2m4.194805832s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.181006699 +0000 UTC m=+144.761007361" watchObservedRunningTime="2026-02-17 14:07:44.194805832 +0000 UTC m=+144.774806484" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.247823 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-54mm8" podStartSLOduration=124.247804014 podStartE2EDuration="2m4.247804014s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.246468387 +0000 UTC m=+144.826469039" watchObservedRunningTime="2026-02-17 14:07:44.247804014 +0000 UTC m=+144.827804666" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.261203 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.262248 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.762221555 +0000 UTC m=+145.342222227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.338280 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" podStartSLOduration=124.338258127 podStartE2EDuration="2m4.338258127s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.335157471 +0000 UTC m=+144.915158123" watchObservedRunningTime="2026-02-17 14:07:44.338258127 +0000 UTC m=+144.918258779" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.381277 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.381562 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.88154462 +0000 UTC m=+145.461545322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.382462 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2dm4" podStartSLOduration=124.382452505 podStartE2EDuration="2m4.382452505s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.382154837 +0000 UTC m=+144.962155489" watchObservedRunningTime="2026-02-17 14:07:44.382452505 +0000 UTC m=+144.962453157" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.431405 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.463422 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j2kdp" podStartSLOduration=125.463400843 podStartE2EDuration="2m5.463400843s" podCreationTimestamp="2026-02-17 14:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.453035645 +0000 UTC m=+145.033036297" watchObservedRunningTime="2026-02-17 14:07:44.463400843 +0000 UTC m=+145.043401495" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.464064 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qh6th"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.484119 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.484297 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.984269632 +0000 UTC m=+145.564270284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.484571 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.485006 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:44.984994593 +0000 UTC m=+145.564995235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.502029 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lpmkg" podStartSLOduration=124.502015625 podStartE2EDuration="2m4.502015625s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.501328566 +0000 UTC m=+145.081329218" watchObservedRunningTime="2026-02-17 14:07:44.502015625 +0000 UTC m=+145.082016277" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.505733 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pl76v"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.556587 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwknl"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.576489 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q8w48" podStartSLOduration=124.576474554 podStartE2EDuration="2m4.576474554s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:44.574976312 +0000 UTC m=+145.154976964" watchObservedRunningTime="2026-02-17 14:07:44.576474554 +0000 UTC m=+145.156475206" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.581876 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8wzgg" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.585731 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.586041 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.086026159 +0000 UTC m=+145.666026811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.588207 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:44 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:44 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:44 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.588378 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.637456 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.660040 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.687415 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.687726 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.187714954 +0000 UTC m=+145.767715606 (durationBeforeRetry 500ms). 
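
The router startup-probe output quoted a few lines above is the standard aggregated-healthz shape: one [+]/[-] line per named check, and an overall HTTP 500 while any check still fails, which is what the kubelet's probe keeps reporting as "statuscode: 500". A sketch of a handler producing that shape; the check names come from the log, the logic is illustrative:

```go
// Illustrative aggregated /healthz handler: per-check status lines
// plus a 500 status while any check fails.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func healthz(checks map[string]func() error) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		failed := false
		body := ""
		for name, check := range checks {
			if err := check(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // probe sees "statuscode: 500"
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	http.HandleFunc("/healthz", healthz(map[string]func() error{
		"backend-http":    func() error { return fmt.Errorf("not ready") }, // still failing
		"has-synced":      func() error { return fmt.Errorf("not ready") },
		"process-running": func() error { return nil },
	}))
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```
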
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.700273 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8stcv"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.728896 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qx5"] Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.791798 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.792109 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.292093614 +0000 UTC m=+145.872094266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.893323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.893667 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.393634984 +0000 UTC m=+145.973635646 (durationBeforeRetry 500ms). 
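
The way out of this mount/unmount loop is already visible just above: the SyncLoop UPDATE for hostpath-provisioner/csi-hostpathplugin-mwknl is the driver pod being scheduled, and once its registrar registers kubevirt.io.hostpath-provisioner with the kubelet, the lookups start succeeding. A hypothetical wait loop with the same shape; waitForDriver and lookup are stand-ins, not kubelet APIs:

```go
// Hypothetical poll-until-registered loop, pairing with the registry
// lookup sketched earlier in this section.
package main

import (
	"fmt"
	"sync/atomic"
	"time"
)

func waitForDriver(name string, lookup func(string) error, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		err := lookup(name)
		if err == nil {
			return nil // driver registered; queued mounts can proceed
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for CSI driver %s: %v", name, err)
		}
		time.Sleep(500 * time.Millisecond) // same cadence as the retries in the log
	}
}

func main() {
	var registered atomic.Bool
	go func() { // simulate the csi-hostpathplugin pod coming up and registering
		time.Sleep(2 * time.Second)
		registered.Store(true)
	}()
	lookup := func(name string) error {
		if !registered.Load() {
			return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
		}
		return nil
	}
	fmt.Println(waitForDriver("kubevirt.io.hostpath-provisioner", lookup, 10*time.Second))
}
```
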
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:44 crc kubenswrapper[4762]: I0217 14:07:44.994195 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:44 crc kubenswrapper[4762]: E0217 14:07:44.995144 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.495124554 +0000 UTC m=+146.075125206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.097008 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.097496 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.597481027 +0000 UTC m=+146.177481679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.141453 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" event={"ID":"3f66bf06-e190-40a2-8503-9e4b5b2f65c6","Type":"ContainerStarted","Data":"0e73ebac43eb08112a89a8fcb17839837bd998e29be38ce59eb17a09f7ff23d0"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.173951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" event={"ID":"80457297-b5b8-4fd5-8d38-70958ec21fd1","Type":"ContainerStarted","Data":"47ddffbd812c0d481b1da45dab146ea03d38107c8b8250a9d507634f8047de39"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.182181 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" event={"ID":"02c7ad77-d801-4f6b-92a9-470b4460d698","Type":"ContainerStarted","Data":"dd22ea2ae40460767e4189e62fd8d111c69b0457590b9f2c8c9e7e38d9f914cd"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.185713 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fc6hb" event={"ID":"846c594b-fb0a-4947-bbd4-cf3984892e88","Type":"ContainerStarted","Data":"bb5ead0cd2c070de6bfa921704769b07e22731b60889c4ac40a4b83795f51f28"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.185793 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.189594 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" event={"ID":"3a2cdcff-72d6-4c93-9157-591b007be2a3","Type":"ContainerStarted","Data":"6748712f456f249e3e0e2b2b58ee055a2889ac3e87960eeae3d7991c44e37873"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.190351 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.190414 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.195479 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" event={"ID":"4307b8bb-8c42-45ed-a8bc-d08da6bf92e9","Type":"ContainerStarted","Data":"886adeb7d4e793679ba3eface4d6a377ba63a416281e396a20366c2086d08dec"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.197904 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.198238 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.698223906 +0000 UTC m=+146.278224548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.207856 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fc6hb" podStartSLOduration=125.207814052 podStartE2EDuration="2m5.207814052s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.202964258 +0000 UTC m=+145.782965770" watchObservedRunningTime="2026-02-17 14:07:45.207814052 +0000 UTC m=+145.787814704" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.218965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pl76v" event={"ID":"12731d21-fa65-4ff9-820e-f961da223378","Type":"ContainerStarted","Data":"01ded641cae2894160d27d5d6c10674e7bc7a99e10604c5511f7335a1b13537e"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.242845 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4wzcf" podStartSLOduration=125.242828185 podStartE2EDuration="2m5.242828185s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.241961501 +0000 UTC m=+145.821962173" watchObservedRunningTime="2026-02-17 14:07:45.242828185 +0000 UTC m=+145.822828837" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.247190 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" event={"ID":"3b826bc6-e50e-4b2c-8737-254c6d743ad8","Type":"ContainerStarted","Data":"a43009ade5dc9fadac15288d2aaf5669b824e632dac0915804395f41039b9736"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.262233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q9qx5" event={"ID":"1c8da7c3-3aaf-4256-9183-8f60b7131e6e","Type":"ContainerStarted","Data":"ad230d3450609d50903adb319ba14d26143ac561028cb098cab3d5b59743c80c"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.268839 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" 
event={"ID":"f7002df3-a8f7-4a82-8268-f4f5112c94be","Type":"ContainerStarted","Data":"6c640ec4d346650bc1f8610312c69bfbc4068aae0165427a8f704e1eb5761a1d"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.268881 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" event={"ID":"f7002df3-a8f7-4a82-8268-f4f5112c94be","Type":"ContainerStarted","Data":"910af6d921141d6a88502d30356274a65782bdd756ae04a1dd2a18ff9d205907"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.269174 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.269983 4762 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-b95q5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.270019 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" podUID="f7002df3-a8f7-4a82-8268-f4f5112c94be" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.304251 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wpkmz" podStartSLOduration=125.304232031 podStartE2EDuration="2m5.304232031s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.303803369 +0000 UTC m=+145.883804021" watchObservedRunningTime="2026-02-17 14:07:45.304232031 +0000 UTC m=+145.884232693" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.304767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.306036 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.80602109 +0000 UTC m=+146.386021842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.312873 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" event={"ID":"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5","Type":"ContainerStarted","Data":"6d566662e5a9ad0d9c3b6f13a10347b58fccd31a397b15ac9582211b73afb8d2"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.319994 4762 csr.go:261] certificate signing request csr-8b7mm is approved, waiting to be issued Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.345659 4762 csr.go:257] certificate signing request csr-8b7mm is issued Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.351182 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zw5wq" event={"ID":"931f2aa0-da21-494b-abe7-9f8b843df3ca","Type":"ContainerStarted","Data":"f2556b98843b35c75e79a89333c2f51b9531d64025e62eea12443bc4373568a3"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.353367 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" event={"ID":"02adf3f5-bd74-409a-8942-f77cba830901","Type":"ContainerStarted","Data":"85f0e973c0b0d46ffbd369f16c8e1a79167e710ec487da7fc4491673c2138db3"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.354015 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.354862 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" event={"ID":"53121465-80f8-4ed0-bc37-369a780868e1","Type":"ContainerStarted","Data":"79ebcce4803fcb8462d6db3f5f8613b1982ffc39a9026b79a8e6b4c842e78c1d"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.355280 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.355882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" event={"ID":"aac37181-0c34-4fae-b735-d1530b599541","Type":"ContainerStarted","Data":"f14688446f110ca72f01d0683a9ed740ea08354306d6de52022bcdbd9e5bc254"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.356534 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" event={"ID":"12de56fb-5540-495c-b841-5093b7bfb534","Type":"ContainerStarted","Data":"e72d20262ca56ac1ac3403ee41b9e04433f6a72a3785bff72775db1c600bc622"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.360513 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" event={"ID":"0d6e0aaf-bec2-4091-a434-58d6cf2be048","Type":"ContainerStarted","Data":"acb325ce4d3e741f9143c7530ae32f17acff887c3336149e56de45bc353f6f24"} Feb 17 14:07:45 crc 
kubenswrapper[4762]: I0217 14:07:45.364799 4762 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4h4z7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.364860 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" podUID="53121465-80f8-4ed0-bc37-369a780868e1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.365070 4762 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-phpw5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.365087 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.365749 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" event={"ID":"f7432567-ff75-4020-bb78-eebafaa815c6","Type":"ContainerStarted","Data":"d0dbf09579ba1f85399a9416adbeaf1768ea8dfcda412143e7124b96dabc61cb"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.367555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" event={"ID":"6b21b018-49bb-4c1f-94db-7c8199012455","Type":"ContainerStarted","Data":"930b8c8f375a0433da3dbbb1bc5c2938ef089ff2d982d613db31281905514e23"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.373787 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" event={"ID":"4c562cce-90d4-4d8e-a172-9b29678930a6","Type":"ContainerStarted","Data":"1a82505adf4a769e55cc523e787ea46d02630c9adf5f54d4584af1fc45c494db"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.384204 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" podStartSLOduration=125.384182562 podStartE2EDuration="2m5.384182562s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.347021599 +0000 UTC m=+145.927022251" watchObservedRunningTime="2026-02-17 14:07:45.384182562 +0000 UTC m=+145.964183214" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.386313 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zw5wq" podStartSLOduration=6.386306371 podStartE2EDuration="6.386306371s" podCreationTimestamp="2026-02-17 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.385701754 +0000 UTC m=+145.965702406" watchObservedRunningTime="2026-02-17 14:07:45.386306371 +0000 UTC m=+145.966307013" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.390370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" event={"ID":"04f375f1-7bd2-4b95-b812-9e114d4e7963","Type":"ContainerStarted","Data":"c215170cd61fed54c5ef535001053afbcd44189591ea3e9a4e948df66112ae0e"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.394719 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" event={"ID":"bb5f7d28-9379-41a1-8e43-048ce98115f2","Type":"ContainerStarted","Data":"c8a52ce55d28ca5104f9455376c40c0d511e194d6c9f2670374131f7c3cf4a13"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.394768 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.406140 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.407592 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:45.907569871 +0000 UTC m=+146.487570523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.417824 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-92nvq" podStartSLOduration=125.417805606 podStartE2EDuration="2m5.417805606s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.412400465 +0000 UTC m=+145.992401127" watchObservedRunningTime="2026-02-17 14:07:45.417805606 +0000 UTC m=+145.997806248" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.426320 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" event={"ID":"f1ab0bfe-87c9-4bec-9a21-e5b28016d4ff","Type":"ContainerStarted","Data":"ed87d2a7c968eecd3d3240a00397caead9dc6f71b6ce7684f803ad97e3fbc2fc"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.440312 4762 generic.go:334] "Generic (PLEG): container finished" podID="ff39058f-4aad-4477-aa68-0550cd30c2fc" containerID="3a47e49a313a87768f09342f769bdb9cd49a6d7fef9f75f1dca364b286e56476" exitCode=0 Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.440694 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" event={"ID":"ff39058f-4aad-4477-aa68-0550cd30c2fc","Type":"ContainerDied","Data":"3a47e49a313a87768f09342f769bdb9cd49a6d7fef9f75f1dca364b286e56476"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.457675 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerStarted","Data":"7a6ea7dcc9688017aa6d85d9918ae68333a411dddb372839ae3e4d61cf15c960"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.458753 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.461262 4762 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xxdg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.461925 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.465100 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" podStartSLOduration=125.465077309 
podStartE2EDuration="2m5.465077309s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.459208526 +0000 UTC m=+146.039209178" watchObservedRunningTime="2026-02-17 14:07:45.465077309 +0000 UTC m=+146.045077961" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.491608 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lqnsz" podStartSLOduration=125.491590256 podStartE2EDuration="2m5.491590256s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.490049763 +0000 UTC m=+146.070050415" watchObservedRunningTime="2026-02-17 14:07:45.491590256 +0000 UTC m=+146.071590908" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.494705 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" event={"ID":"0b7fbfea-5829-4958-8427-1182a8aba592","Type":"ContainerStarted","Data":"88493afa392dfec15261c3de78bba19ff3660f7890b2c9308d2d0b87054a6bc9"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.504554 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" event={"ID":"9ea675be-b02f-49aa-a817-c50252ba1aed","Type":"ContainerStarted","Data":"ed1b120530cdb5182ded5e55705d0519fee9020c0779ca98ae03931c4c618d7a"} Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.510037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.511749 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.011735555 +0000 UTC m=+146.591736207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.535583 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" podStartSLOduration=125.535562997 podStartE2EDuration="2m5.535562997s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.535040892 +0000 UTC m=+146.115041544" watchObservedRunningTime="2026-02-17 14:07:45.535562997 +0000 UTC m=+146.115563649" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.581262 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:45 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:45 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:45 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.581554 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.584953 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rjv84" podStartSLOduration=125.584939129 podStartE2EDuration="2m5.584939129s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.584095805 +0000 UTC m=+146.164096457" watchObservedRunningTime="2026-02-17 14:07:45.584939129 +0000 UTC m=+146.164939781" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.617572 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.619706 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.119681694 +0000 UTC m=+146.699682386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.696187 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" podStartSLOduration=125.696158058 podStartE2EDuration="2m5.696158058s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.684857084 +0000 UTC m=+146.264857736" watchObservedRunningTime="2026-02-17 14:07:45.696158058 +0000 UTC m=+146.276158710" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.706817 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ctdpq" podStartSLOduration=125.706798444 podStartE2EDuration="2m5.706798444s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.704464309 +0000 UTC m=+146.284464961" watchObservedRunningTime="2026-02-17 14:07:45.706798444 +0000 UTC m=+146.286799096" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.719405 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.719805 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.219789845 +0000 UTC m=+146.799790497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.770137 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" podStartSLOduration=125.770118483 podStartE2EDuration="2m5.770118483s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.769022313 +0000 UTC m=+146.349022985" watchObservedRunningTime="2026-02-17 14:07:45.770118483 +0000 UTC m=+146.350119135" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.806352 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" podStartSLOduration=125.806329229 podStartE2EDuration="2m5.806329229s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:45.794237013 +0000 UTC m=+146.374237665" watchObservedRunningTime="2026-02-17 14:07:45.806329229 +0000 UTC m=+146.386329881" Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.821338 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.821528 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.32150193 +0000 UTC m=+146.901502582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.821729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.822016 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.322008974 +0000 UTC m=+146.902009626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:45 crc kubenswrapper[4762]: I0217 14:07:45.928016 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:45 crc kubenswrapper[4762]: E0217 14:07:45.928700 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.428683978 +0000 UTC m=+147.008684630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.030493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.031154 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.531134064 +0000 UTC m=+147.111134716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.135854 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.136401 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.636367517 +0000 UTC m=+147.216368169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.237491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.238065 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.738045982 +0000 UTC m=+147.318046704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.338732 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.339176 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.839159661 +0000 UTC m=+147.419160313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.352797 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 14:02:45 +0000 UTC, rotation deadline is 2026-12-16 13:02:25.253598173 +0000 UTC Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.352848 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7246h54m38.900752522s for next certificate rotation Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.440165 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.440490 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:46.940474865 +0000 UTC m=+147.520475517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.519245 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" event={"ID":"80457297-b5b8-4fd5-8d38-70958ec21fd1","Type":"ContainerStarted","Data":"4e806bb210a480cdbf9988c779cee65e00849e21f7b494e1e2dcc88c9bde54d1"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.519659 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.520667 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cbkzt" event={"ID":"0b7fbfea-5829-4958-8427-1182a8aba592","Type":"ContainerStarted","Data":"4e8ac218b4ed02d586c480f0f768f7d9e87a84913bd4b885aca8d08508248f5d"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.522599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q9qx5" event={"ID":"1c8da7c3-3aaf-4256-9183-8f60b7131e6e","Type":"ContainerStarted","Data":"b4e3a68cf21d560fc40398a2a705869dec52ec534a72a4074f5f117558aaebee"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.524372 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" event={"ID":"3f66bf06-e190-40a2-8503-9e4b5b2f65c6","Type":"ContainerStarted","Data":"ca16c54075c1d04387ef3558088928141f7d5941473278a0cb4f2937f37c7ddc"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.526224 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" event={"ID":"6b21b018-49bb-4c1f-94db-7c8199012455","Type":"ContainerStarted","Data":"b33d19f80cac2ab9b7f69296cf9c9965954cd5f50e95882c5955ce6f26c2cb14"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.527980 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" event={"ID":"ff39058f-4aad-4477-aa68-0550cd30c2fc","Type":"ContainerStarted","Data":"f505d3d3bd4d90bead32095fbe8ae0522d622faa88685eb76fa7386d7293e41c"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.529500 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" event={"ID":"3a2cdcff-72d6-4c93-9157-591b007be2a3","Type":"ContainerStarted","Data":"7b00e6d2a5ecf8011c1feca10381628b556aa910b04561c8d388c06cce9b1792"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.531518 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pl76v" event={"ID":"12731d21-fa65-4ff9-820e-f961da223378","Type":"ContainerStarted","Data":"78c579d8c4d5ca65421c42d17712d6eeda818952794435a0ef91dae449e2e6c2"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.531545 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pl76v" event={"ID":"12731d21-fa65-4ff9-820e-f961da223378","Type":"ContainerStarted","Data":"3243e9c900ad75c306e71eb23e3fc9d7f7ad17279e3684a2280db984d20484cf"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.531613 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pl76v" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.534052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" event={"ID":"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5","Type":"ContainerStarted","Data":"eb2f669d69df50325665fb0da2f34ab0b12332974b562367ec91ceda5faeb6e1"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.534088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" event={"ID":"5ed1b85f-76bf-4fac-ac4e-eeb448205ad5","Type":"ContainerStarted","Data":"0048e9b25fe0d8f922e41ca762e622b7b3fc97865865d9e7f3d5e303be2f9760"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.536518 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" event={"ID":"0d6e0aaf-bec2-4091-a434-58d6cf2be048","Type":"ContainerStarted","Data":"e87e870075a484ad2d3fd946b454e3af61f685a844300f9364da057510bbe729"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.536754 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.538206 4762 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g6gf6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: 
connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.538258 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" podUID="0d6e0aaf-bec2-4091-a434-58d6cf2be048" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.539217 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" event={"ID":"02c7ad77-d801-4f6b-92a9-470b4460d698","Type":"ContainerStarted","Data":"1911cd6b140be14153fab64b22f996bfe9de1fefeb4454588ddfe53df07d4ae2"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.539251 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" event={"ID":"02c7ad77-d801-4f6b-92a9-470b4460d698","Type":"ContainerStarted","Data":"160c64c6e5454c324074d57dd6ed0168b09fca9f66bb49747ed0432e12026ab5"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.542776 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.542973 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.042954372 +0000 UTC m=+147.622955024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.543273 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.543352 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8" podStartSLOduration=126.543340573 podStartE2EDuration="2m6.543340573s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.542606392 +0000 UTC m=+147.122607044" watchObservedRunningTime="2026-02-17 14:07:46.543340573 +0000 UTC m=+147.123341225" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.543516 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.043509148 +0000 UTC m=+147.623509800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.547443 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" event={"ID":"04f375f1-7bd2-4b95-b812-9e114d4e7963","Type":"ContainerStarted","Data":"8b946d5a90c4a239dfd8a55db9345fa25e41da03566ee2570805fd42e9fe541e"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.549504 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" event={"ID":"8ec4d05a-6a11-4e6f-aaf4-0e17540dfeb5","Type":"ContainerStarted","Data":"ca2c2f41908448482a4c63b46787fbe07ce07c4cf2e2a4316bc73421e87c05b0"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.551775 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" event={"ID":"aac37181-0c34-4fae-b735-d1530b599541","Type":"ContainerStarted","Data":"0d7550aabe45aff53d0d728bdc6ea33603ffdb517aeb988cd1747625d9df412b"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.551813 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" event={"ID":"aac37181-0c34-4fae-b735-d1530b599541","Type":"ContainerStarted","Data":"55b1a2f3a9bf5d8ab07b15cb5640f57dd11b85cfdb2b28c0b4cf355394a8d18c"} Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.552289 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.552327 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.552542 4762 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-phpw5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.552570 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.553020 4762 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-b95q5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.553069 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" podUID="f7002df3-a8f7-4a82-8268-f4f5112c94be" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.553183 4762 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4h4z7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.553257 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" podUID="53121465-80f8-4ed0-bc37-369a780868e1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.557383 4762 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xxdg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.557416 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.563510 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" podStartSLOduration=125.563495113 podStartE2EDuration="2m5.563495113s" podCreationTimestamp="2026-02-17 14:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.561212319 +0000 UTC m=+147.141212971" watchObservedRunningTime="2026-02-17 14:07:46.563495113 +0000 UTC m=+147.143495765" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.579689 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:46 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:46 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:46 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.579755 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.600388 4762 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress-canary/ingress-canary-q9qx5" podStartSLOduration=7.600371137 podStartE2EDuration="7.600371137s" podCreationTimestamp="2026-02-17 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.579063775 +0000 UTC m=+147.159064427" watchObservedRunningTime="2026-02-17 14:07:46.600371137 +0000 UTC m=+147.180371789" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.602504 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm2sc" podStartSLOduration=126.60248781600001 podStartE2EDuration="2m6.602487816s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.60010063 +0000 UTC m=+147.180101292" watchObservedRunningTime="2026-02-17 14:07:46.602487816 +0000 UTC m=+147.182488468" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.644808 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.648004 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.14798432 +0000 UTC m=+147.727984972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.689436 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tn8f4" podStartSLOduration=126.689416481 podStartE2EDuration="2m6.689416481s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.647457605 +0000 UTC m=+147.227458257" watchObservedRunningTime="2026-02-17 14:07:46.689416481 +0000 UTC m=+147.269417123" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.743766 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" podStartSLOduration=126.74374888 podStartE2EDuration="2m6.74374888s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.705704413 +0000 UTC m=+147.285705065" watchObservedRunningTime="2026-02-17 14:07:46.74374888 +0000 UTC m=+147.323749532" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.747599 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.747963 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.247950337 +0000 UTC m=+147.827950989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.752890 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.753442 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.757471 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pl76v" podStartSLOduration=7.757445891 podStartE2EDuration="7.757445891s" podCreationTimestamp="2026-02-17 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.742380302 +0000 UTC m=+147.322380964" watchObservedRunningTime="2026-02-17 14:07:46.757445891 +0000 UTC m=+147.337446543" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.767812 4762 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-x9g8w container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.767869 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" podUID="ff39058f-4aad-4477-aa68-0550cd30c2fc" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.844752 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" podStartSLOduration=126.844737096 podStartE2EDuration="2m6.844737096s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.810773722 +0000 UTC m=+147.390774384" watchObservedRunningTime="2026-02-17 14:07:46.844737096 +0000 UTC m=+147.424737748" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.849586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.849873 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.349860378 +0000 UTC m=+147.929861030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.869264 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" podStartSLOduration=126.869247526 podStartE2EDuration="2m6.869247526s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.846303889 +0000 UTC m=+147.426304531" watchObservedRunningTime="2026-02-17 14:07:46.869247526 +0000 UTC m=+147.449248178" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.894596 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gjmh5" podStartSLOduration=125.89457806 podStartE2EDuration="2m5.89457806s" podCreationTimestamp="2026-02-17 14:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.870626495 +0000 UTC m=+147.450627147" watchObservedRunningTime="2026-02-17 14:07:46.89457806 +0000 UTC m=+147.474578712" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.929339 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8stcv" podStartSLOduration=125.929321635 podStartE2EDuration="2m5.929321635s" podCreationTimestamp="2026-02-17 14:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.927874235 +0000 UTC m=+147.507874887" watchObservedRunningTime="2026-02-17 14:07:46.929321635 +0000 UTC m=+147.509322287" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.930733 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvbqb" podStartSLOduration=126.930725354 podStartE2EDuration="2m6.930725354s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.895228368 +0000 UTC m=+147.475229020" watchObservedRunningTime="2026-02-17 14:07:46.930725354 +0000 UTC m=+147.510726006" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.955423 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:46 crc kubenswrapper[4762]: I0217 14:07:46.955413 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qh6th" podStartSLOduration=126.95539508 
podStartE2EDuration="2m6.95539508s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:46.95505647 +0000 UTC m=+147.535057132" watchObservedRunningTime="2026-02-17 14:07:46.95539508 +0000 UTC m=+147.535395732" Feb 17 14:07:46 crc kubenswrapper[4762]: E0217 14:07:46.955723 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.455711328 +0000 UTC m=+148.035711990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.056793 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.057010 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.556978532 +0000 UTC m=+148.136979194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.057275 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.057690 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.557680351 +0000 UTC m=+148.137681003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.158455 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.158612 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.658591964 +0000 UTC m=+148.238592636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.158907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.159232 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.659205991 +0000 UTC m=+148.239206643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.259978 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.260345 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.760330901 +0000 UTC m=+148.340331553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.361754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.362109 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.862093397 +0000 UTC m=+148.442094059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.367452 4762 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9878n container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.367452 4762 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9878n container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.367516 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" podUID="bb5f7d28-9379-41a1-8e43-048ce98115f2" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.367547 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" podUID="bb5f7d28-9379-41a1-8e43-048ce98115f2" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.474613 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.475052 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:47.975033275 +0000 UTC m=+148.555033927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.559043 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" event={"ID":"12de56fb-5540-495c-b841-5093b7bfb534","Type":"ContainerStarted","Data":"ea8def2253cb059918d2b67ac79630ff0ab6e1997eb57ab8f12943beb17663d0"} Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.559739 4762 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g6gf6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.559801 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" podUID="0d6e0aaf-bec2-4091-a434-58d6cf2be048" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.559971 4762 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xxdg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.560008 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.572240 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:47 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:47 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:47 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.572310 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.576315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.576811 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.076789962 +0000 UTC m=+148.656790684 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.629371 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4h4z7" Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.677503 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.677734 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.177688305 +0000 UTC m=+148.757688967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.678467 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.680997 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.180984396 +0000 UTC m=+148.760985048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.782254 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.782438 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.282405044 +0000 UTC m=+148.862405696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.782859 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.783158 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.283145544 +0000 UTC m=+148.863146196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.884454 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.884604 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.384582202 +0000 UTC m=+148.964582854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.884827 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.885178 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.385162208 +0000 UTC m=+148.965162860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.986370 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.986547 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.486524634 +0000 UTC m=+149.066525286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:47 crc kubenswrapper[4762]: I0217 14:07:47.986628 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:47 crc kubenswrapper[4762]: E0217 14:07:47.986954 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.486945306 +0000 UTC m=+149.066945958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.088255 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.088474 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.089334 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.589302678 +0000 UTC m=+149.169303330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.091473 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.190017 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.190071 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.190111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.190179 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.190511 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.690494789 +0000 UTC m=+149.270495441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.197190 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.197203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.205488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.283554 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.291036 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.292202 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.292357 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.792332178 +0000 UTC m=+149.372332830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.292557 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.292928 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.792917544 +0000 UTC m=+149.372918196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.388118 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.393971 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.394087 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:07:48.894068664 +0000 UTC m=+149.474069316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.394386 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.394807 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.894797415 +0000 UTC m=+149.474798077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.495350 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.499354 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.499809 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:48.999791091 +0000 UTC m=+149.579791743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.590888 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:48 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:48 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:48 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.590943 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.600709 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.601134 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.101118126 +0000 UTC m=+149.681118778 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.705323 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.706454 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.206435102 +0000 UTC m=+149.786435764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.818176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.818762 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.318750632 +0000 UTC m=+149.898751284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.919972 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.920141 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.420117098 +0000 UTC m=+150.000117750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:48 crc kubenswrapper[4762]: I0217 14:07:48.920194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:48 crc kubenswrapper[4762]: E0217 14:07:48.920483 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.420475828 +0000 UTC m=+150.000476470 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.021368 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.021766 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.521742601 +0000 UTC m=+150.101743263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: W0217 14:07:49.076421 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9ff7ef1f67944cf2d989dabdce9ca0896df57d10d3f460ff419ae92658275bc0 WatchSource:0}: Error finding container 9ff7ef1f67944cf2d989dabdce9ca0896df57d10d3f460ff419ae92658275bc0: Status 404 returned error can't find the container with id 9ff7ef1f67944cf2d989dabdce9ca0896df57d10d3f460ff419ae92658275bc0 Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.105902 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.109667 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.118092 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.118814 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.118975 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.123335 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.123778 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.623762885 +0000 UTC m=+150.203763537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.225427 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.225549 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.725526692 +0000 UTC m=+150.305527344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.225667 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.225747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.225826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.226140 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.726133099 +0000 UTC m=+150.306133741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.329111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.329382 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.329491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.329917 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.829903222 +0000 UTC m=+150.409903874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.329948 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.360476 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.430922 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.431289 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:49.931273788 +0000 UTC m=+150.511274440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.440553 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.441878 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.449033 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.460517 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.462566 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.532016 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.532491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjh2\" (UniqueName: \"kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.532521 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.532546 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.532743 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.032726336 +0000 UTC m=+150.612726988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.578788 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:49 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:49 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:49 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.578841 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.601665 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" event={"ID":"12de56fb-5540-495c-b841-5093b7bfb534","Type":"ContainerStarted","Data":"8aa63a9f4f024ab6495874d9b9ff730d1733be512aa221df9f7d451836509764"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.602704 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"98ccfbd0236a407e8acb9c0921edb8a872aff1dee6c10ccf6b0e4d3f5e2ad8a7"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.602733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"819122b125ad7516465942869d3d089a09b7026e3a5549a8bc19b220bcfec9dc"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.611953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5b156b529da077fafe0f714193801d1e210638b61d333cd0c2f2ae39b88373f6"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.611992 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9ff7ef1f67944cf2d989dabdce9ca0896df57d10d3f460ff419ae92658275bc0"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.612590 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.614292 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9819c218b3e64d37de9fbe0c1b1cf2e20d75719f9951f46308140baedcca4aa8"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 
14:07:49.614315 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9770c27a4943718aced17e168da9e01187a3755ac4a778765e1a19f5467fd346"} Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.633810 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.633852 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.633929 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.633969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjh2\" (UniqueName: \"kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.634652 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.634857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.635088 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.13507764 +0000 UTC m=+150.715078292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.640169 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.641060 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.655253 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.669911 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.701908 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjh2\" (UniqueName: \"kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2\") pod \"certified-operators-5h5kh\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.745853 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.746121 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.746214 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t5ld\" (UniqueName: \"kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.746243 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.746348 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:07:50.24632943 +0000 UTC m=+150.826330072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.769073 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.836034 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.836928 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.849697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t5ld\" (UniqueName: \"kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.849752 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.849809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.849836 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.850108 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.350094893 +0000 UTC m=+150.930095545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.850853 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.851161 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.881270 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t5ld\" (UniqueName: \"kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld\") pod \"community-operators-qpj7t\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.902899 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.953656 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.953866 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.453840145 +0000 UTC m=+151.033840797 (durationBeforeRetry 500ms). 
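Note the cadence: at roughly 100ms intervals (49.746, 49.850, 49.953, then 50.056, 50.157, 50.262 below) the same UnmountVolume for terminated pod 8f668bae... and MountVolume for registry pod 6c354ccb... reappear. That is the volume manager's reconciler re-running: each pass it diffs the desired state of the world against the actual state and re-issues whatever operations are still pending and out of backoff. A toy model of that diff, with illustrative names and none of the real bookkeeping:

    package main

    import "fmt"

    // reconcile is a toy rendering of one reconciler pass: volumes that are
    // mounted but no longer desired get UnmountVolume, volumes that are
    // desired but not mounted get MountVolume. The real reconciler also
    // tracks per-operation backoff, which is why the log repeats every pass.
    func reconcile(desired, actual map[string]bool) (mounts, unmounts []string) {
    	for v := range actual {
    		if !desired[v] {
    			unmounts = append(unmounts, v)
    		}
    	}
    	for v := range desired {
    		if !actual[v] {
    			mounts = append(mounts, v)
    		}
    	}
    	return
    }

    func main() {
    	// The PVC is still "actual" for the old registry pod while "desired"
    	// for its replacement, so both operations fire on every pass.
    	desired := map[string]bool{"pvc-657094db for pod 6c354ccb": true}
    	actual := map[string]bool{"pvc-657094db for pod 8f668bae": true}
    	m, u := reconcile(desired, actual)
    	fmt.Println("MountVolume started:", m)
    	fmt.Println("UnmountVolume started:", u)
    }
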
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.953913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.953960 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddm66\" (UniqueName: \"kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.953979 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.954016 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:49 crc kubenswrapper[4762]: E0217 14:07:49.954267 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.454256526 +0000 UTC m=+151.034257178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.967832 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:07:49 crc kubenswrapper[4762]: I0217 14:07:49.995574 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:07:50 crc kubenswrapper[4762]: W0217 14:07:50.002904 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3eb322ab_0cf1_448e_8e5a_fbd14f55a267.slice/crio-4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7 WatchSource:0}: Error finding container 4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7: Status 404 returned error can't find the container with id 4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7 Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.023276 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j27jc"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.024469 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.035945 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j27jc"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.055829 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056138 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: E0217 14:07:50.056221 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.556197528 +0000 UTC m=+151.136198200 (durationBeforeRetry 500ms). 
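Interleaved with the volume retries, the marketplace catalog pods (community-operators-qpj7t and -j27jc, certified-operators-5h5kh and -66rsm) march through the normal startup path: SyncLoop ADD, "No sandbox for pod can be found. Need to start a new one", then volume setup; the cadvisor "Failed to process watch event ... 404" warning alongside them is an apparently benign race, since the same container surfaces via PLEG moments later. The sandbox message marks kubelet's pod sync deciding it must create a fresh pod sandbox through the CRI before any container can start. A toy rendering of that ordering; the Runtime interface and the container names are illustrative stand-ins, not the real CRI API:

    package main

    import "fmt"

    // Runtime is a stand-in for the CRI runtime service; the method names
    // here are illustrative, not the real CRI gRPC surface.
    type Runtime interface {
    	SandboxFor(pod string) (id string, ok bool)
    	RunPodSandbox(pod string) (string, error)
    	StartContainer(sandboxID, name string) error
    }

    // syncPod mirrors the ordering in the log: ensure the pod sandbox
    // (network namespace etc.) exists, starting a new one when none is
    // found, and only then start containers inside it.
    func syncPod(rt Runtime, pod string, containers []string) error {
    	id, ok := rt.SandboxFor(pod)
    	if !ok {
    		fmt.Printf("No sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
    		var err error
    		if id, err = rt.RunPodSandbox(pod); err != nil {
    			return err
    		}
    	}
    	for _, c := range containers {
    		if err := rt.StartContainer(id, c); err != nil {
    			return err
    		}
    	}
    	return nil
    }

    type fakeRT struct{ sandboxes map[string]string }

    func (f *fakeRT) SandboxFor(pod string) (string, bool) { id, ok := f.sandboxes[pod]; return id, ok }

    func (f *fakeRT) RunPodSandbox(pod string) (string, error) {
    	f.sandboxes[pod] = "sandbox-for-" + pod
    	return f.sandboxes[pod], nil
    }

    func (f *fakeRT) StartContainer(sandboxID, name string) error {
    	fmt.Println("started container", name, "in", sandboxID)
    	return nil
    }

    func main() {
    	rt := &fakeRT{sandboxes: map[string]string{}}
    	_ = syncPod(rt, "openshift-marketplace/community-operators-qpj7t",
    		[]string{"extract-utilities", "extract-content", "registry-server"})
    }
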
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056356 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056438 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlnm\" (UniqueName: \"kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddm66\" (UniqueName: \"kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056548 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: E0217 14:07:50.056750 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.556737943 +0000 UTC m=+151.136738595 (durationBeforeRetry 500ms). 
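Apart from the stuck PVC, every volume in these entries mounts instantly: each catalog pod carries two emptyDir scratch volumes (utilities and catalog-content) plus a kube-api-access-* projected volume, which the API server injects to hold the service-account token, the cluster CA bundle, and the namespace. emptyDir volumes live and die with the pod on local disk, which is why their MountVolume.SetUp lines succeed within the same reconciler pass. A sketch of how a pod author declares the two emptyDirs (the projected token volume is injected by admission, not declared by the author); the types are from k8s.io/api/core/v1:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	// The two scratch volumes seen for each marketplace catalog pod.
    	// No external provisioner is involved, so setup never blocks on a
    	// CSI driver the way the PVC above does.
    	vols := []corev1.Volume{
    		{Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    	}
    	for _, v := range vols {
    		fmt.Println("declared volume:", v.Name)
    	}
    }
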
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.056929 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.098376 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddm66\" (UniqueName: \"kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66\") pod \"certified-operators-66rsm\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.142509 4762 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.157022 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.157225 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlnm\" (UniqueName: \"kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.157247 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.157295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.157753 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: E0217 14:07:50.157819 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.657805011 +0000 UTC m=+151.237805663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.158222 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.184751 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.191515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlnm\" (UniqueName: \"kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm\") pod \"community-operators-j27jc\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.242802 4762 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T14:07:50.142533036Z","Handler":null,"Name":""} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.261915 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: E0217 14:07:50.262459 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:07:50.762448618 +0000 UTC m=+151.342449270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lm4gz" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.268230 4762 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.268256 4762 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.283989 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.284586 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.289071 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.289724 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.307853 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.317411 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.378163 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.378369 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.378427 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.380624 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.391331 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9878n" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.433627 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.481273 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.481352 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.481406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.481966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.528593 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
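With the driver validated and registered by csi_plugin.go at 14:07:50.268 (endpoint /var/lib/kubelet/plugins/csi-hostpath/csi.sock), the parked operations drain: UnmountVolume.TearDown for the old pod succeeds at .433, and MountDevice resolves at .528 without doing any work, because the driver does not advertise the STAGE_UNSTAGE_VOLUME node capability. Kubelet therefore skips NodeStageVolume and proceeds straight to the publish step, which is the "MountVolume.SetUp succeeded" for the PVC a few entries below. A minimal sketch of that capability gate; the capability name matches the CSI spec, while the map type is a stand-in for the real NodeGetCapabilities response:

    package main

    import "fmt"

    // nodeCapabilities stands in for the result of a CSI NodeGetCapabilities
    // RPC; the real call returns protobuf capability messages.
    type nodeCapabilities map[string]bool

    // mountDevice mirrors the gate in the log: NodeStageVolume runs only
    // when the driver advertises STAGE_UNSTAGE_VOLUME; otherwise MountDevice
    // is skipped and kubelet moves directly to NodePublishVolume (SetUp).
    func mountDevice(caps nodeCapabilities, volumeHandle, stagingPath string) {
    	if !caps["STAGE_UNSTAGE_VOLUME"] {
    		fmt.Println("attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
    		return
    	}
    	fmt.Printf("NodeStageVolume(%s) -> %s\n", volumeHandle, stagingPath)
    }

    func main() {
    	hostpath := nodeCapabilities{} // hostpath provisioner: no staging support
    	mountDevice(hostpath,
    		"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8",
    		"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/.../globalmount")
    }

For a hostpath-style driver there is no block device to stage per node, so the publish call alone is enough to materialize the mount for the pod, and kubelet still records the device mount path (the .../globalmount seen below) for its own bookkeeping.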
Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.528628 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.538392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.565093 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.580249 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:50 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:50 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:50 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.580304 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.636571 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.641316 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lm4gz\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.641874 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerStarted","Data":"50431a81480dca1d5aa8be321acb74024d022bb437e7fdb55f27dcaa9320d695"} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.649011 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerStarted","Data":"7b02ff8b3474fab42237600397818b6b5adf0275ac76d12b1825a56fc9933952"} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.651188 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.652981 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" event={"ID":"12de56fb-5540-495c-b841-5093b7bfb534","Type":"ContainerStarted","Data":"1c421763f8d993bb0f1b153b1bf8f8279a2aff2c0228c2b4fb36389163d84c60"} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.653018 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" event={"ID":"12de56fb-5540-495c-b841-5093b7bfb534","Type":"ContainerStarted","Data":"28a170ed94fb91ff72b6f35b8bff86db3d37e03400ca5a1f63db6c14b179868d"} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.682359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3eb322ab-0cf1-448e-8e5a-fbd14f55a267","Type":"ContainerStarted","Data":"4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7"} Feb 17 14:07:50 crc kubenswrapper[4762]: I0217 14:07:50.698970 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mwknl" podStartSLOduration=11.698944753 podStartE2EDuration="11.698944753s" podCreationTimestamp="2026-02-17 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:50.693991136 +0000 UTC m=+151.273991808" watchObservedRunningTime="2026-02-17 14:07:50.698944753 +0000 UTC m=+151.278945405" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.004586 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:07:51 crc kubenswrapper[4762]: E0217 14:07:51.024994 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea39a651_661f_4d01_9420_71469f5d2b8c.slice/crio-a3917a426f245b435d453bce4d32b069cf10e28751f43a04699450c57e15258d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea39a651_661f_4d01_9420_71469f5d2b8c.slice/crio-conmon-a3917a426f245b435d453bce4d32b069cf10e28751f43a04699450c57e15258d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17efb526_3519_4d99_bd81_cd6fed3a42aa.slice/crio-conmon-2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.085134 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j27jc"] Feb 17 14:07:51 crc kubenswrapper[4762]: W0217 14:07:51.097920 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4505d245_d558_4112_893d_75b19c128b09.slice/crio-17942061e8fb438a8e5ca86f8e63a1e22bd2d2eca4d345272307d11046eca8a8 WatchSource:0}: Error finding container 17942061e8fb438a8e5ca86f8e63a1e22bd2d2eca4d345272307d11046eca8a8: Status 404 returned error can't find the container with id 17942061e8fb438a8e5ca86f8e63a1e22bd2d2eca4d345272307d11046eca8a8 Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.132148 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.160624 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.160926 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.164725 4762 patch_prober.go:28] interesting pod/console-f9d7485db-54mm8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.164779 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-54mm8" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.188192 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"] Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.478235 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.478321 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.487106 4762 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fqmtz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]log ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]etcd ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 17 14:07:51 crc kubenswrapper[4762]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 17 14:07:51 crc kubenswrapper[4762]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 17 14:07:51 crc kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 14:07:51 crc kubenswrapper[4762]: livez check failed Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.487176 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz" 
podUID="5ed1b85f-76bf-4fac-ac4e-eeb448205ad5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.569206 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-s9l2w" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.572437 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:51 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:51 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:51 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.572516 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.626777 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.628384 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.634289 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.646957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.687723 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" event={"ID":"6c354ccb-6431-46df-a43d-d3e97f3529ae","Type":"ContainerStarted","Data":"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.687761 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" event={"ID":"6c354ccb-6431-46df-a43d-d3e97f3529ae","Type":"ContainerStarted","Data":"9b5980c9d8a065bcd4209997c1ae2ce7fe63b4f509b7f39019b517657c34910b"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.687799 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.689144 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3eb322ab-0cf1-448e-8e5a-fbd14f55a267","Type":"ContainerStarted","Data":"f926e3dcc8a7364f256c4f375db2834013b1fe4bac7ddad5072b9063c017f06a"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.690535 4762 generic.go:334] "Generic (PLEG): container finished" podID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerID="a3917a426f245b435d453bce4d32b069cf10e28751f43a04699450c57e15258d" exitCode=0 Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.690588 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" 
event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerDied","Data":"a3917a426f245b435d453bce4d32b069cf10e28751f43a04699450c57e15258d"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.692674 4762 generic.go:334] "Generic (PLEG): container finished" podID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerID="2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762" exitCode=0 Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.692747 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.692818 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerDied","Data":"2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.695107 4762 generic.go:334] "Generic (PLEG): container finished" podID="4505d245-d558-4112-893d-75b19c128b09" containerID="b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77" exitCode=0 Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.695158 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerDied","Data":"b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.695176 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerStarted","Data":"17942061e8fb438a8e5ca86f8e63a1e22bd2d2eca4d345272307d11046eca8a8"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.697474 4762 generic.go:334] "Generic (PLEG): container finished" podID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerID="0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31" exitCode=0 Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.697530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerDied","Data":"0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.697549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerStarted","Data":"244f21222bf6a06cdf751507bfeb4bbf88c40e93bd7c7e7f71473ef2b7812688"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.706615 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3","Type":"ContainerStarted","Data":"178aee745e19620050f8c15f71bc788ab6c1fe8b01f430e44d9e5f9ab968a9f3"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.706671 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3","Type":"ContainerStarted","Data":"3cf90fa32a1f32955e3203775d8f2aa9316f8a5cb9131a8cddf0b726dda72928"} Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.713762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.713822 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.713850 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqzs\" (UniqueName: \"kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.733935 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" podStartSLOduration=131.733913404 podStartE2EDuration="2m11.733913404s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:51.713041484 +0000 UTC m=+152.293042146" watchObservedRunningTime="2026-02-17 14:07:51.733913404 +0000 UTC m=+152.313914056" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.757957 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.757939061 podStartE2EDuration="1.757939061s" podCreationTimestamp="2026-02-17 14:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:51.754889226 +0000 UTC m=+152.334889878" watchObservedRunningTime="2026-02-17 14:07:51.757939061 +0000 UTC m=+152.337939713" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.764338 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.772530 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x9g8w" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.774983 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.774962634 podStartE2EDuration="2.774962634s" podCreationTimestamp="2026-02-17 14:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:51.77480942 +0000 UTC m=+152.354810072" watchObservedRunningTime="2026-02-17 14:07:51.774962634 +0000 UTC m=+152.354963286" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.797093 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:07:51 crc 
kubenswrapper[4762]: I0217 14:07:51.797147 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.797191 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.797252 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.815762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.816895 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.835830 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.835912 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqzs\" (UniqueName: \"kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.837111 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.880027 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqzs\" (UniqueName: \"kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs\") pod \"redhat-marketplace-lb2z7\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:51 crc kubenswrapper[4762]: I0217 14:07:51.943390 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.028928 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"] Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.030283 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.044225 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"] Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.085199 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.144173 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.144233 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.144325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsn6p\" (UniqueName: \"kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.245913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.245959 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.245994 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsn6p\" (UniqueName: \"kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.247812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities\") pod 
\"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.248100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.253987 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g6gf6" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.268322 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsn6p\" (UniqueName: \"kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p\") pod \"redhat-marketplace-q7zdn\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") " pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.283525 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.344290 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b95q5" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.352258 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.377338 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.572833 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:07:52 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld Feb 17 14:07:52 crc kubenswrapper[4762]: [+]process-running ok Feb 17 14:07:52 crc kubenswrapper[4762]: healthz check failed Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.573235 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.637496 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"] Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.640754 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.642296 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"]
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.643017 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.703513 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"]
Feb 17 14:07:52 crc kubenswrapper[4762]: W0217 14:07:52.713915 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1770df5_1061_4617_91ae_3909f5fe514f.slice/crio-3de3f28d7b4934a0b540b1578eed346837435b4b6940f8b9ef45d3b97142cd7d WatchSource:0}: Error finding container 3de3f28d7b4934a0b540b1578eed346837435b4b6940f8b9ef45d3b97142cd7d: Status 404 returned error can't find the container with id 3de3f28d7b4934a0b540b1578eed346837435b4b6940f8b9ef45d3b97142cd7d
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.762378 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f66bf06-e190-40a2-8503-9e4b5b2f65c6" containerID="ca16c54075c1d04387ef3558088928141f7d5941473278a0cb4f2937f37c7ddc" exitCode=0
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.762457 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" event={"ID":"3f66bf06-e190-40a2-8503-9e4b5b2f65c6","Type":"ContainerDied","Data":"ca16c54075c1d04387ef3558088928141f7d5941473278a0cb4f2937f37c7ddc"}
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.769329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mggzp\" (UniqueName: \"kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.769426 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.769457 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.773978 4762 generic.go:334] "Generic (PLEG): container finished" podID="23c1ddb0-986c-4801-9172-0f372eebae07" containerID="4f17dc0df37f3cd997ff008f30518b534ddf83822773d5e1bcf48f229630bbc6" exitCode=0
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.774043 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerDied","Data":"4f17dc0df37f3cd997ff008f30518b534ddf83822773d5e1bcf48f229630bbc6"}
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.774074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerStarted","Data":"f977244a4c9ab995537d8980dba05a1b1b3ec3d4364b7c182eec382a42012338"}
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.788942 4762 generic.go:334] "Generic (PLEG): container finished" podID="4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" containerID="178aee745e19620050f8c15f71bc788ab6c1fe8b01f430e44d9e5f9ab968a9f3" exitCode=0
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.789148 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3","Type":"ContainerDied","Data":"178aee745e19620050f8c15f71bc788ab6c1fe8b01f430e44d9e5f9ab968a9f3"}
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.792926 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3eb322ab-0cf1-448e-8e5a-fbd14f55a267","Type":"ContainerDied","Data":"f926e3dcc8a7364f256c4f375db2834013b1fe4bac7ddad5072b9063c017f06a"}
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.793201 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb322ab-0cf1-448e-8e5a-fbd14f55a267" containerID="f926e3dcc8a7364f256c4f375db2834013b1fe4bac7ddad5072b9063c017f06a" exitCode=0
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.870551 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.870609 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.870707 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mggzp\" (UniqueName: \"kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.874793 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.875325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:52 crc kubenswrapper[4762]: I0217 14:07:52.921794 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mggzp\" (UniqueName: \"kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp\") pod \"redhat-operators-28cgn\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.007233 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.035592 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.036969 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.068971 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.175431 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4jg\" (UniqueName: \"kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.175557 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.175604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.279495 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.279575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4jg\" (UniqueName: \"kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.279660 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.280164 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.280458 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.332587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4jg\" (UniqueName: \"kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg\") pod \"redhat-operators-hv4vz\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") " pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.419130 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.443350 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"]
Feb 17 14:07:53 crc kubenswrapper[4762]: W0217 14:07:53.467598 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490d6026_4fbb_49b1_993c_09dd3e60db65.slice/crio-95b7e3a89d7aa8fadf37ea9bf243e120b4c22021f16b6095b9fc4ba4e9574fa0 WatchSource:0}: Error finding container 95b7e3a89d7aa8fadf37ea9bf243e120b4c22021f16b6095b9fc4ba4e9574fa0: Status 404 returned error can't find the container with id 95b7e3a89d7aa8fadf37ea9bf243e120b4c22021f16b6095b9fc4ba4e9574fa0
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.575110 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:53 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:53 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:53 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.575166 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.812144 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerStarted","Data":"95b7e3a89d7aa8fadf37ea9bf243e120b4c22021f16b6095b9fc4ba4e9574fa0"}
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.814830 4762 generic.go:334] "Generic (PLEG): container finished" podID="a1770df5-1061-4617-91ae-3909f5fe514f" containerID="9c84f9c706f800efebe3783429ec9d551d4a7e4cf2786d005b3382c519c861bb" exitCode=0
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.815248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerDied","Data":"9c84f9c706f800efebe3783429ec9d551d4a7e4cf2786d005b3382c519c861bb"}
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.815280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerStarted","Data":"3de3f28d7b4934a0b540b1578eed346837435b4b6940f8b9ef45d3b97142cd7d"}
Feb 17 14:07:53 crc kubenswrapper[4762]: I0217 14:07:53.816743 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.162910 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.322624 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") pod \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.322684 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume\") pod \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.322717 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwtmj\" (UniqueName: \"kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj\") pod \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\" (UID: \"3f66bf06-e190-40a2-8503-9e4b5b2f65c6\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.323918 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f66bf06-e190-40a2-8503-9e4b5b2f65c6" (UID: "3f66bf06-e190-40a2-8503-9e4b5b2f65c6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.331347 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj" (OuterVolumeSpecName: "kube-api-access-wwtmj") pod "3f66bf06-e190-40a2-8503-9e4b5b2f65c6" (UID: "3f66bf06-e190-40a2-8503-9e4b5b2f65c6"). InnerVolumeSpecName "kube-api-access-wwtmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.332982 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f66bf06-e190-40a2-8503-9e4b5b2f65c6" (UID: "3f66bf06-e190-40a2-8503-9e4b5b2f65c6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.364658 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.384516 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.424848 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.424883 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.424922 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwtmj\" (UniqueName: \"kubernetes.io/projected/3f66bf06-e190-40a2-8503-9e4b5b2f65c6-kube-api-access-wwtmj\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access\") pod \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526127 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir\") pod \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526154 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir\") pod \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\" (UID: \"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526184 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access\") pod \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\" (UID: \"3eb322ab-0cf1-448e-8e5a-fbd14f55a267\") "
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526268 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3eb322ab-0cf1-448e-8e5a-fbd14f55a267" (UID: "3eb322ab-0cf1-448e-8e5a-fbd14f55a267"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526310 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" (UID: "4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526582 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.526595 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.530439 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3eb322ab-0cf1-448e-8e5a-fbd14f55a267" (UID: "3eb322ab-0cf1-448e-8e5a-fbd14f55a267"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.530490 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" (UID: "4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.575274 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:54 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:54 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:54 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.575524 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.621174 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.621220 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.629214 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb322ab-0cf1-448e-8e5a-fbd14f55a267-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.629257 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.837268 4762 generic.go:334] "Generic (PLEG): container finished" podID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerID="4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03" exitCode=0
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.837342 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerDied","Data":"4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.837374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerStarted","Data":"8136ff1e3a40df4a9508f1c5626cd8fd8c81c3c67cc8c996271b31f948307289"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.842762 4762 generic.go:334] "Generic (PLEG): container finished" podID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerID="9ecff109aa58a217903f0d52a20f142acec4e3dcc4ea14415a3552896acdc421" exitCode=0
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.842878 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerDied","Data":"9ecff109aa58a217903f0d52a20f142acec4e3dcc4ea14415a3552896acdc421"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.845730 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.845779 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3","Type":"ContainerDied","Data":"3cf90fa32a1f32955e3203775d8f2aa9316f8a5cb9131a8cddf0b726dda72928"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.845816 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cf90fa32a1f32955e3203775d8f2aa9316f8a5cb9131a8cddf0b726dda72928"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.854067 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3eb322ab-0cf1-448e-8e5a-fbd14f55a267","Type":"ContainerDied","Data":"4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.854105 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de851ed6314ad76d956e9be1dd691fb698c81d1aa4ff3f6a4b683175c5813d7"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.854216 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.870224 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj" event={"ID":"3f66bf06-e190-40a2-8503-9e4b5b2f65c6","Type":"ContainerDied","Data":"0e73ebac43eb08112a89a8fcb17839837bd998e29be38ce59eb17a09f7ff23d0"}
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.870262 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e73ebac43eb08112a89a8fcb17839837bd998e29be38ce59eb17a09f7ff23d0"
Feb 17 14:07:54 crc kubenswrapper[4762]: I0217 14:07:54.870317 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"
Feb 17 14:07:55 crc kubenswrapper[4762]: I0217 14:07:55.571443 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:55 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:55 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:55 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:55 crc kubenswrapper[4762]: I0217 14:07:55.571511 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:56 crc kubenswrapper[4762]: I0217 14:07:56.482794 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:56 crc kubenswrapper[4762]: I0217 14:07:56.490948 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fqmtz"
Feb 17 14:07:56 crc kubenswrapper[4762]: I0217 14:07:56.583885 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:56 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:56 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:56 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:56 crc kubenswrapper[4762]: I0217 14:07:56.583953 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:57 crc kubenswrapper[4762]: I0217 14:07:57.572739 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:57 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:57 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:57 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:57 crc kubenswrapper[4762]: I0217 14:07:57.573161 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:57 crc kubenswrapper[4762]: I0217 14:07:57.671217 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pl76v"
Feb 17 14:07:58 crc kubenswrapper[4762]: I0217 14:07:58.590678 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:58 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:58 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:58 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:58 crc kubenswrapper[4762]: I0217 14:07:58.590736 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:07:59 crc kubenswrapper[4762]: I0217 14:07:59.572611 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:07:59 crc kubenswrapper[4762]: [-]has-synced failed: reason withheld
Feb 17 14:07:59 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:07:59 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:07:59 crc kubenswrapper[4762]: I0217 14:07:59.572984 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:00 crc kubenswrapper[4762]: I0217 14:08:00.574807 4762 patch_prober.go:28] interesting pod/router-default-5444994796-s9l2w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:00 crc kubenswrapper[4762]: [+]has-synced ok
Feb 17 14:08:00 crc kubenswrapper[4762]: [+]process-running ok
Feb 17 14:08:00 crc kubenswrapper[4762]: healthz check failed
Feb 17 14:08:00 crc kubenswrapper[4762]: I0217 14:08:00.574875 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s9l2w" podUID="af9aff26-c327-4fe9-ba97-e7ab3f453fa2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.160082 4762 patch_prober.go:28] interesting pod/console-f9d7485db-54mm8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.160139 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-54mm8" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.580055 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-s9l2w"
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.582917 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-s9l2w"
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.796619 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.796688 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.796620 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:01 crc kubenswrapper[4762]: I0217 14:08:01.796782 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:02 crc kubenswrapper[4762]: I0217 14:08:02.669259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:08:02 crc kubenswrapper[4762]: I0217 14:08:02.715749 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63580a98-4d0e-434e-ad09-e7d542e7a5cc-metrics-certs\") pod \"network-metrics-daemon-7v8bf\" (UID: \"63580a98-4d0e-434e-ad09-e7d542e7a5cc\") " pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:08:02 crc kubenswrapper[4762]: I0217 14:08:02.996186 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v8bf"
Feb 17 14:08:10 crc kubenswrapper[4762]: I0217 14:08:10.658985 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.247589 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-54mm8"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.253116 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-54mm8"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.797327 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.797571 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.798119 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.798171 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.798214 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-fc6hb"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.799290 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.799318 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.799763 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"bb5ead0cd2c070de6bfa921704769b07e22731b60889c4ac40a4b83795f51f28"} pod="openshift-console/downloads-7954f5f757-fc6hb" containerMessage="Container download-server failed liveness probe, will be restarted"
Feb 17 14:08:11 crc kubenswrapper[4762]: I0217 14:08:11.799849 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" containerID="cri-o://bb5ead0cd2c070de6bfa921704769b07e22731b60889c4ac40a4b83795f51f28" gracePeriod=2
Feb 17 14:08:13 crc kubenswrapper[4762]: I0217 14:08:13.158961 4762 generic.go:334] "Generic (PLEG): container finished" podID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerID="bb5ead0cd2c070de6bfa921704769b07e22731b60889c4ac40a4b83795f51f28" exitCode=0
Feb 17 14:08:13 crc kubenswrapper[4762]: I0217 14:08:13.159009 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fc6hb" event={"ID":"846c594b-fb0a-4947-bbd4-cf3984892e88","Type":"ContainerDied","Data":"bb5ead0cd2c070de6bfa921704769b07e22731b60889c4ac40a4b83795f51f28"}
Feb 17 14:08:21 crc kubenswrapper[4762]: I0217 14:08:21.797416 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:21 crc kubenswrapper[4762]: I0217 14:08:21.798040 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:22 crc kubenswrapper[4762]: I0217 14:08:22.008939 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5jpk8"
Feb 17 14:08:24 crc kubenswrapper[4762]: I0217 14:08:24.621608 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:08:24 crc kubenswrapper[4762]: I0217 14:08:24.621989 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:08:26 crc kubenswrapper[4762]: E0217 14:08:26.425902 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 17 14:08:26 crc kubenswrapper[4762]: E0217 14:08:26.426074 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddm66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-66rsm_openshift-marketplace(8fcc9b44-0a23-4690-8620-ede69e43a7f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:26 crc kubenswrapper[4762]: E0217 14:08:26.427731 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-66rsm" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4"
Feb 17 14:08:28 crc kubenswrapper[4762]: I0217 14:08:28.306053 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.634457 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:08:29 crc kubenswrapper[4762]: E0217 14:08:29.635250 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635263 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: E0217 14:08:29.635278 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f66bf06-e190-40a2-8503-9e4b5b2f65c6" containerName="collect-profiles"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635285 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f66bf06-e190-40a2-8503-9e4b5b2f65c6" containerName="collect-profiles"
Feb 17 14:08:29 crc kubenswrapper[4762]: E0217 14:08:29.635293 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb322ab-0cf1-448e-8e5a-fbd14f55a267" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635299 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb322ab-0cf1-448e-8e5a-fbd14f55a267" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635388 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f66bf06-e190-40a2-8503-9e4b5b2f65c6" containerName="collect-profiles"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635396 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb322ab-0cf1-448e-8e5a-fbd14f55a267" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635404 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0d5f8e-d00e-4ca6-bb1a-704b5a678ab3" containerName="pruner"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.635819 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.637880 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.638100 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.657346 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.692466 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.692862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.796725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.797004 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.797121 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.814497 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:29 crc kubenswrapper[4762]: I0217 14:08:29.957699 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.269025 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-66rsm" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.350974 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.351175 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tt4jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hv4vz_openshift-marketplace(2f1332eb-9672-4d20-b2e4-4d26287d6464): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.352348 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hv4vz" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.370814 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.371039 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mggzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-28cgn_openshift-marketplace(490d6026-4fbb-49b1-993c-09dd3e60db65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:31 crc kubenswrapper[4762]: E0217 14:08:31.372369 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-28cgn" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65"
Feb 17 14:08:31 crc kubenswrapper[4762]: I0217 14:08:31.796744 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 17 14:08:31 crc kubenswrapper[4762]: I0217 14:08:31.797027 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.789476 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hv4vz" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.789521 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-28cgn" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.851848 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.852240 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sqzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lb2z7_openshift-marketplace(23c1ddb0-986c-4801-9172-0f372eebae07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.854191 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lb2z7" podUID="23c1ddb0-986c-4801-9172-0f372eebae07"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.870984 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.871123 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jsn6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-q7zdn_openshift-marketplace(a1770df5-1061-4617-91ae-3909f5fe514f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:32 crc kubenswrapper[4762]: E0217 14:08:32.872316 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-q7zdn" podUID="a1770df5-1061-4617-91ae-3909f5fe514f"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.638934 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lb2z7" podUID="23c1ddb0-986c-4801-9172-0f372eebae07"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.640113 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-q7zdn" podUID="a1770df5-1061-4617-91ae-3909f5fe514f"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.701830 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.701985 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6t5ld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qpj7t_openshift-marketplace(17efb526-3519-4d99-bd81-cd6fed3a42aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.703289 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qpj7t" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.804123 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.804610 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpjh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5h5kh_openshift-marketplace(ea39a651-661f-4d01-9420-71469f5d2b8c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.806265 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5h5kh" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.904769 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.904949 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8zlnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-j27jc_openshift-marketplace(4505d245-d558-4112-893d-75b19c128b09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 14:08:34 crc kubenswrapper[4762]: E0217 14:08:34.908715 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-j27jc" podUID="4505d245-d558-4112-893d-75b19c128b09" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.033194 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7v8bf"] Feb 17 14:08:35 crc kubenswrapper[4762]: W0217 14:08:35.039676 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63580a98_4d0e_434e_ad09_e7d542e7a5cc.slice/crio-047b8eb0d9c8027b0259a19c4b302066fd675d73fc0d0782497cf4cd9b88405d WatchSource:0}: Error finding container 047b8eb0d9c8027b0259a19c4b302066fd675d73fc0d0782497cf4cd9b88405d: Status 404 returned error can't find the container with id 047b8eb0d9c8027b0259a19c4b302066fd675d73fc0d0782497cf4cd9b88405d Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.116958 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.296601 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"63257264-bf1c-402c-907f-6bf6a1ce50ea","Type":"ContainerStarted","Data":"972d9dd30491339f2afa9254dca659d7f71f70be99104fda11f130e1a058a3c1"} Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.298399 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" event={"ID":"63580a98-4d0e-434e-ad09-e7d542e7a5cc","Type":"ContainerStarted","Data":"047b8eb0d9c8027b0259a19c4b302066fd675d73fc0d0782497cf4cd9b88405d"} Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 
14:08:35.301391 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fc6hb" event={"ID":"846c594b-fb0a-4947-bbd4-cf3984892e88","Type":"ContainerStarted","Data":"258e81069ef3bdc99d375f336ab498854f40c47b444531e1c27c97066cdecbf3"} Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.301476 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:08:35 crc kubenswrapper[4762]: E0217 14:08:35.301593 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-j27jc" podUID="4505d245-d558-4112-893d-75b19c128b09" Feb 17 14:08:35 crc kubenswrapper[4762]: E0217 14:08:35.302983 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qpj7t" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" Feb 17 14:08:35 crc kubenswrapper[4762]: E0217 14:08:35.303407 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5h5kh" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.305538 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.305602 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.424435 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.425876 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.437780 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.563030 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.563404 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.563450 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.664550 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.664623 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.664732 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.664797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.664725 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.686717 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:35 crc kubenswrapper[4762]: I0217 14:08:35.745086 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.124452 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 14:08:36 crc kubenswrapper[4762]: W0217 14:08:36.139157 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ac57045_b522_4701_8c80_c3fdf4aaeb14.slice/crio-69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c WatchSource:0}: Error finding container 69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c: Status 404 returned error can't find the container with id 69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.314451 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3ac57045-b522-4701-8c80-c3fdf4aaeb14","Type":"ContainerStarted","Data":"69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c"} Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.315475 4762 generic.go:334] "Generic (PLEG): container finished" podID="63257264-bf1c-402c-907f-6bf6a1ce50ea" containerID="3a8acd77517ba7faf2fda4a118cb38da446c86b4fda2345eec458e3cc84cf74e" exitCode=0 Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.315936 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"63257264-bf1c-402c-907f-6bf6a1ce50ea","Type":"ContainerDied","Data":"3a8acd77517ba7faf2fda4a118cb38da446c86b4fda2345eec458e3cc84cf74e"} Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.319906 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" event={"ID":"63580a98-4d0e-434e-ad09-e7d542e7a5cc","Type":"ContainerStarted","Data":"5fb0f34c1d958f109ad7f6515f87bbb8041329e0fa6d9803d24193a83f0d5c6e"} Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.319937 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v8bf" event={"ID":"63580a98-4d0e-434e-ad09-e7d542e7a5cc","Type":"ContainerStarted","Data":"72ade46f3a9a60132c078ab8167b01190e7e4bf3bb32b7e87dd2915c3325d698"} Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.320321 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.320349 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:08:36 crc kubenswrapper[4762]: I0217 14:08:36.350747 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7v8bf" podStartSLOduration=176.350724759 podStartE2EDuration="2m56.350724759s" podCreationTimestamp="2026-02-17 14:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:36.347026386 +0000 UTC m=+196.927027048" watchObservedRunningTime="2026-02-17 14:08:36.350724759 +0000 UTC m=+196.930725421" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.325794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3ac57045-b522-4701-8c80-c3fdf4aaeb14","Type":"ContainerStarted","Data":"29722c15b564b6372b5038cc5ecbc8c02741c2426af9f180271672554b128e25"} Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.341671 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.341635485 podStartE2EDuration="2.341635485s" podCreationTimestamp="2026-02-17 14:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:37.338262012 +0000 UTC m=+197.918262684" watchObservedRunningTime="2026-02-17 14:08:37.341635485 +0000 UTC m=+197.921636137" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.648982 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.687810 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access\") pod \"63257264-bf1c-402c-907f-6bf6a1ce50ea\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.687891 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir\") pod \"63257264-bf1c-402c-907f-6bf6a1ce50ea\" (UID: \"63257264-bf1c-402c-907f-6bf6a1ce50ea\") " Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.688145 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "63257264-bf1c-402c-907f-6bf6a1ce50ea" (UID: "63257264-bf1c-402c-907f-6bf6a1ce50ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.700072 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "63257264-bf1c-402c-907f-6bf6a1ce50ea" (UID: "63257264-bf1c-402c-907f-6bf6a1ce50ea"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.789519 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63257264-bf1c-402c-907f-6bf6a1ce50ea-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:08:37 crc kubenswrapper[4762]: I0217 14:08:37.789555 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63257264-bf1c-402c-907f-6bf6a1ce50ea-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:08:38 crc kubenswrapper[4762]: I0217 14:08:38.331893 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 14:08:38 crc kubenswrapper[4762]: I0217 14:08:38.332339 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"63257264-bf1c-402c-907f-6bf6a1ce50ea","Type":"ContainerDied","Data":"972d9dd30491339f2afa9254dca659d7f71f70be99104fda11f130e1a058a3c1"} Feb 17 14:08:38 crc kubenswrapper[4762]: I0217 14:08:38.332368 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972d9dd30491339f2afa9254dca659d7f71f70be99104fda11f130e1a058a3c1" Feb 17 14:08:40 crc kubenswrapper[4762]: I0217 14:08:40.923974 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"] Feb 17 14:08:41 crc kubenswrapper[4762]: I0217 14:08:41.797964 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:08:41 crc kubenswrapper[4762]: I0217 14:08:41.798017 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:08:41 crc kubenswrapper[4762]: I0217 14:08:41.797969 4762 patch_prober.go:28] interesting pod/downloads-7954f5f757-fc6hb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 17 14:08:41 crc kubenswrapper[4762]: I0217 14:08:41.798459 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fc6hb" podUID="846c594b-fb0a-4947-bbd4-cf3984892e88" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 17 14:08:44 crc kubenswrapper[4762]: I0217 14:08:44.555905 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerStarted","Data":"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462"} Feb 17 14:08:46 crc kubenswrapper[4762]: I0217 14:08:46.569709 4762 generic.go:334] "Generic (PLEG): container finished" podID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerID="171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462" exitCode=0 Feb 17 14:08:46 crc kubenswrapper[4762]: I0217 14:08:46.569752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerDied","Data":"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462"} Feb 17 14:08:47 crc kubenswrapper[4762]: I0217 14:08:47.591139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerStarted","Data":"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"} Feb 17 14:08:51 crc kubenswrapper[4762]: I0217 14:08:51.616416 4762 generic.go:334] "Generic (PLEG): container finished" 
podID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerID="62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c" exitCode=0 Feb 17 14:08:51 crc kubenswrapper[4762]: I0217 14:08:51.616490 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerDied","Data":"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"} Feb 17 14:08:51 crc kubenswrapper[4762]: I0217 14:08:51.818297 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fc6hb" Feb 17 14:08:54 crc kubenswrapper[4762]: I0217 14:08:54.621270 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:08:54 crc kubenswrapper[4762]: I0217 14:08:54.621683 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:08:54 crc kubenswrapper[4762]: I0217 14:08:54.621742 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:08:54 crc kubenswrapper[4762]: I0217 14:08:54.622412 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:08:54 crc kubenswrapper[4762]: I0217 14:08:54.622479 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5" gracePeriod=600 Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.821731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerStarted","Data":"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.825328 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerStarted","Data":"dffdf1b369e5e57cd2eddd1e31fcfc7853467ca7cbac06acb97d54866e17738a"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.827819 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerStarted","Data":"2490c7b9ab2f1f553722df509e44c8d2bb12bbe29fa6b51a4b64addb84ea43fd"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.830760 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerStarted","Data":"86710bb5aafd789e3f8fffcae0fcafc14bfefc204b8dc7713dd0ed34f0b475d7"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.843104 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerStarted","Data":"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.846382 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5" exitCode=0 Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.846425 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.846440 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.849702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerStarted","Data":"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"} Feb 17 14:08:55 crc kubenswrapper[4762]: I0217 14:08:55.851850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-66rsm" podStartSLOduration=3.3274629989999998 podStartE2EDuration="1m6.851835288s" podCreationTimestamp="2026-02-17 14:07:49 +0000 UTC" firstStartedPulling="2026-02-17 14:07:51.698552951 +0000 UTC m=+152.278553593" lastFinishedPulling="2026-02-17 14:08:55.2229252 +0000 UTC m=+215.802925882" observedRunningTime="2026-02-17 14:08:55.850307225 +0000 UTC m=+216.430307877" watchObservedRunningTime="2026-02-17 14:08:55.851835288 +0000 UTC m=+216.431835940" Feb 17 14:08:58 crc kubenswrapper[4762]: I0217 14:08:58.060301 4762 generic.go:334] "Generic (PLEG): container finished" podID="23c1ddb0-986c-4801-9172-0f372eebae07" containerID="86710bb5aafd789e3f8fffcae0fcafc14bfefc204b8dc7713dd0ed34f0b475d7" exitCode=0 Feb 17 14:08:58 crc kubenswrapper[4762]: I0217 14:08:58.060478 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerDied","Data":"86710bb5aafd789e3f8fffcae0fcafc14bfefc204b8dc7713dd0ed34f0b475d7"} Feb 17 14:08:58 crc kubenswrapper[4762]: I0217 14:08:58.065473 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerStarted","Data":"7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d"} Feb 17 14:08:58 crc kubenswrapper[4762]: I0217 14:08:58.069236 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" 
event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerStarted","Data":"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"} Feb 17 14:08:58 crc kubenswrapper[4762]: I0217 14:08:58.275348 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hv4vz" podStartSLOduration=3.516067775 podStartE2EDuration="1m5.275326709s" podCreationTimestamp="2026-02-17 14:07:53 +0000 UTC" firstStartedPulling="2026-02-17 14:07:54.840574306 +0000 UTC m=+155.420574958" lastFinishedPulling="2026-02-17 14:08:56.59983324 +0000 UTC m=+217.179833892" observedRunningTime="2026-02-17 14:08:58.148290633 +0000 UTC m=+218.728291305" watchObservedRunningTime="2026-02-17 14:08:58.275326709 +0000 UTC m=+218.855327361" Feb 17 14:08:59 crc kubenswrapper[4762]: I0217 14:08:59.077164 4762 generic.go:334] "Generic (PLEG): container finished" podID="4505d245-d558-4112-893d-75b19c128b09" containerID="ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88" exitCode=0 Feb 17 14:08:59 crc kubenswrapper[4762]: I0217 14:08:59.077460 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerDied","Data":"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"} Feb 17 14:08:59 crc kubenswrapper[4762]: I0217 14:08:59.079974 4762 generic.go:334] "Generic (PLEG): container finished" podID="a1770df5-1061-4617-91ae-3909f5fe514f" containerID="2490c7b9ab2f1f553722df509e44c8d2bb12bbe29fa6b51a4b64addb84ea43fd" exitCode=0 Feb 17 14:08:59 crc kubenswrapper[4762]: I0217 14:08:59.080002 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerDied","Data":"2490c7b9ab2f1f553722df509e44c8d2bb12bbe29fa6b51a4b64addb84ea43fd"} Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.100535 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerStarted","Data":"2f762ef10cb4bf7ed4d53f849ab8cb444bb18752a7e7dc38fb4e587d464d0322"} Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.105005 4762 generic.go:334] "Generic (PLEG): container finished" podID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerID="e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc" exitCode=0 Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.105058 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerDied","Data":"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc"} Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.135796 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lb2z7" podStartSLOduration=3.2010069469999998 podStartE2EDuration="1m9.13576919s" podCreationTimestamp="2026-02-17 14:07:51 +0000 UTC" firstStartedPulling="2026-02-17 14:07:52.789751105 +0000 UTC m=+153.369751757" lastFinishedPulling="2026-02-17 14:08:58.724513348 +0000 UTC m=+219.304514000" observedRunningTime="2026-02-17 14:09:00.131317967 +0000 UTC m=+220.711318639" watchObservedRunningTime="2026-02-17 14:09:00.13576919 +0000 UTC m=+220.715769852" Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.329660 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:00 crc kubenswrapper[4762]: I0217 14:09:00.329703 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:01 crc kubenswrapper[4762]: I0217 14:09:01.944106 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:09:01 crc kubenswrapper[4762]: I0217 14:09:01.946063 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:09:02 crc kubenswrapper[4762]: E0217 14:09:02.094730 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea39a651_661f_4d01_9420_71469f5d2b8c.slice/crio-7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea39a651_661f_4d01_9420_71469f5d2b8c.slice/crio-conmon-7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.345619 4762 generic.go:334] "Generic (PLEG): container finished" podID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerID="7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d" exitCode=0 Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.345736 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerDied","Data":"7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d"} Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.369310 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerStarted","Data":"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"} Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.373694 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerStarted","Data":"74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7"} Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.644554 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q7zdn" podStartSLOduration=4.634572192 podStartE2EDuration="1m10.644536109s" podCreationTimestamp="2026-02-17 14:07:52 +0000 UTC" firstStartedPulling="2026-02-17 14:07:53.853040302 +0000 UTC m=+154.433040954" lastFinishedPulling="2026-02-17 14:08:59.863004219 +0000 UTC m=+220.443004871" observedRunningTime="2026-02-17 14:09:02.643011027 +0000 UTC m=+223.223011689" watchObservedRunningTime="2026-02-17 14:09:02.644536109 +0000 UTC m=+223.224536761" Feb 17 14:09:02 crc kubenswrapper[4762]: I0217 14:09:02.646039 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j27jc" podStartSLOduration=4.535336626 podStartE2EDuration="1m12.646031311s" podCreationTimestamp="2026-02-17 14:07:50 +0000 UTC" firstStartedPulling="2026-02-17 
14:07:51.696115104 +0000 UTC m=+152.276115756" lastFinishedPulling="2026-02-17 14:08:59.806809789 +0000 UTC m=+220.386810441" observedRunningTime="2026-02-17 14:09:02.424592444 +0000 UTC m=+223.004593096" watchObservedRunningTime="2026-02-17 14:09:02.646031311 +0000 UTC m=+223.226031963" Feb 17 14:09:03 crc kubenswrapper[4762]: I0217 14:09:03.301058 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-lb2z7" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="registry-server" probeResult="failure" output=< Feb 17 14:09:03 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:09:03 crc kubenswrapper[4762]: > Feb 17 14:09:03 crc kubenswrapper[4762]: I0217 14:09:03.307207 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-66rsm" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="registry-server" probeResult="failure" output=< Feb 17 14:09:03 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:09:03 crc kubenswrapper[4762]: > Feb 17 14:09:03 crc kubenswrapper[4762]: I0217 14:09:03.421331 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hv4vz" Feb 17 14:09:03 crc kubenswrapper[4762]: I0217 14:09:03.421393 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hv4vz" Feb 17 14:09:04 crc kubenswrapper[4762]: I0217 14:09:04.503357 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hv4vz" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="registry-server" probeResult="failure" output=< Feb 17 14:09:04 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:09:04 crc kubenswrapper[4762]: > Feb 17 14:09:05 crc kubenswrapper[4762]: I0217 14:09:05.400941 4762 generic.go:334] "Generic (PLEG): container finished" podID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerID="dffdf1b369e5e57cd2eddd1e31fcfc7853467ca7cbac06acb97d54866e17738a" exitCode=0 Feb 17 14:09:05 crc kubenswrapper[4762]: I0217 14:09:05.401012 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerDied","Data":"dffdf1b369e5e57cd2eddd1e31fcfc7853467ca7cbac06acb97d54866e17738a"} Feb 17 14:09:05 crc kubenswrapper[4762]: I0217 14:09:05.984545 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" containerID="cri-o://85f0e973c0b0d46ffbd369f16c8e1a79167e710ec487da7fc4491673c2138db3" gracePeriod=15 Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.409334 4762 generic.go:334] "Generic (PLEG): container finished" podID="02adf3f5-bd74-409a-8942-f77cba830901" containerID="85f0e973c0b0d46ffbd369f16c8e1a79167e710ec487da7fc4491673c2138db3" exitCode=0 Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.409409 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" event={"ID":"02adf3f5-bd74-409a-8942-f77cba830901","Type":"ContainerDied","Data":"85f0e973c0b0d46ffbd369f16c8e1a79167e710ec487da7fc4491673c2138db3"} Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.411653 4762 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerStarted","Data":"b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f"} Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.415360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerStarted","Data":"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae"} Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.443273 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5h5kh" podStartSLOduration=3.515127958 podStartE2EDuration="1m17.443252805s" podCreationTimestamp="2026-02-17 14:07:49 +0000 UTC" firstStartedPulling="2026-02-17 14:07:51.69240017 +0000 UTC m=+152.272400822" lastFinishedPulling="2026-02-17 14:09:05.620525027 +0000 UTC m=+226.200525669" observedRunningTime="2026-02-17 14:09:06.428170656 +0000 UTC m=+227.008171308" watchObservedRunningTime="2026-02-17 14:09:06.443252805 +0000 UTC m=+227.023253457" Feb 17 14:09:06 crc kubenswrapper[4762]: I0217 14:09:06.444454 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpj7t" podStartSLOduration=4.928233704 podStartE2EDuration="1m17.444446278s" podCreationTimestamp="2026-02-17 14:07:49 +0000 UTC" firstStartedPulling="2026-02-17 14:07:51.693998335 +0000 UTC m=+152.273998987" lastFinishedPulling="2026-02-17 14:09:04.210210909 +0000 UTC m=+224.790211561" observedRunningTime="2026-02-17 14:09:06.441832826 +0000 UTC m=+227.021833478" watchObservedRunningTime="2026-02-17 14:09:06.444446278 +0000 UTC m=+227.024446930" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.423904 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerStarted","Data":"ed81fdd85e7cb910429f3cf771061c13a5cc19be1f4cd90b321c2d48e0b4e9c1"} Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.459430 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-28cgn" podStartSLOduration=3.351041056 podStartE2EDuration="1m15.459408421s" podCreationTimestamp="2026-02-17 14:07:52 +0000 UTC" firstStartedPulling="2026-02-17 14:07:54.851116769 +0000 UTC m=+155.431117421" lastFinishedPulling="2026-02-17 14:09:06.959484134 +0000 UTC m=+227.539484786" observedRunningTime="2026-02-17 14:09:07.458507416 +0000 UTC m=+228.038508068" watchObservedRunningTime="2026-02-17 14:09:07.459408421 +0000 UTC m=+228.039409073" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.492722 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.518684 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-89w45"] Feb 17 14:09:07 crc kubenswrapper[4762]: E0217 14:09:07.518948 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.518965 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" Feb 17 14:09:07 crc kubenswrapper[4762]: E0217 14:09:07.518978 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63257264-bf1c-402c-907f-6bf6a1ce50ea" containerName="pruner" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.518985 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="63257264-bf1c-402c-907f-6bf6a1ce50ea" containerName="pruner" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.519113 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="02adf3f5-bd74-409a-8942-f77cba830901" containerName="oauth-openshift" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.519131 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="63257264-bf1c-402c-907f-6bf6a1ce50ea" containerName="pruner" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.519602 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.545390 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-89w45"] Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680462 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680767 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z25qq\" (UniqueName: \"kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680793 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680815 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680849 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680870 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680915 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680955 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.680991 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681021 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681040 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681064 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681093 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle\") pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681126 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error\") 
pod \"02adf3f5-bd74-409a-8942-f77cba830901\" (UID: \"02adf3f5-bd74-409a-8942-f77cba830901\") " Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681199 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681278 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681305 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681324 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64tjl\" (UniqueName: \"kubernetes.io/projected/1f74db19-5919-4499-94ee-2ff89ac79cef-kube-api-access-64tjl\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681355 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681375 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681398 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681453 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681498 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681518 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681537 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681593 4762 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681349 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681376 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681765 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.681941 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.687844 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.688336 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.688825 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.689191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq" (OuterVolumeSpecName: "kube-api-access-z25qq") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "kube-api-access-z25qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.700716 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.700903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.703113 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.705094 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.705552 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "02adf3f5-bd74-409a-8942-f77cba830901" (UID: "02adf3f5-bd74-409a-8942-f77cba830901"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782545 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782597 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782665 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782690 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64tjl\" (UniqueName: \"kubernetes.io/projected/1f74db19-5919-4499-94ee-2ff89ac79cef-kube-api-access-64tjl\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782717 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" 
Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782776 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782804 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782839 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782919 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782967 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782984 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.782997 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc 
kubenswrapper[4762]: I0217 14:09:07.783010 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783024 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783039 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783052 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783066 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z25qq\" (UniqueName: \"kubernetes.io/projected/02adf3f5-bd74-409a-8942-f77cba830901-kube-api-access-z25qq\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783081 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783094 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783107 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783122 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02adf3f5-bd74-409a-8942-f77cba830901-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783133 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02adf3f5-bd74-409a-8942-f77cba830901-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783630 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783696 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.783707 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.784223 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f74db19-5919-4499-94ee-2ff89ac79cef-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.785675 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.786055 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.786080 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.786888 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.787911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.788285 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.788317 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.788689 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f74db19-5919-4499-94ee-2ff89ac79cef-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.808096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64tjl\" (UniqueName: \"kubernetes.io/projected/1f74db19-5919-4499-94ee-2ff89ac79cef-kube-api-access-64tjl\") pod \"oauth-openshift-9bc7b6b6b-89w45\" (UID: \"1f74db19-5919-4499-94ee-2ff89ac79cef\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:07 crc kubenswrapper[4762]: I0217 14:09:07.835277 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.199681 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-89w45"] Feb 17 14:09:08 crc kubenswrapper[4762]: W0217 14:09:08.206929 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f74db19_5919_4499_94ee_2ff89ac79cef.slice/crio-7b57443050e4500cee1b7aa50da41e118a0cb10fe5c156edc974191b40ec187a WatchSource:0}: Error finding container 7b57443050e4500cee1b7aa50da41e118a0cb10fe5c156edc974191b40ec187a: Status 404 returned error can't find the container with id 7b57443050e4500cee1b7aa50da41e118a0cb10fe5c156edc974191b40ec187a Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.431757 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" event={"ID":"1f74db19-5919-4499-94ee-2ff89ac79cef","Type":"ContainerStarted","Data":"7b57443050e4500cee1b7aa50da41e118a0cb10fe5c156edc974191b40ec187a"} Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.433684 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" event={"ID":"02adf3f5-bd74-409a-8942-f77cba830901","Type":"ContainerDied","Data":"439f97fd81cf77e412e0dacf2e7be27738b5a58642ae8b87fd6a21ae4ba02ba1"} Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.433728 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-phpw5" Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.433749 4762 scope.go:117] "RemoveContainer" containerID="85f0e973c0b0d46ffbd369f16c8e1a79167e710ec487da7fc4491673c2138db3" Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.461281 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"] Feb 17 14:09:08 crc kubenswrapper[4762]: I0217 14:09:08.465244 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-phpw5"] Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.440219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" event={"ID":"1f74db19-5919-4499-94ee-2ff89ac79cef","Type":"ContainerStarted","Data":"e6cbcab3a9892bb41fc13cd3fb9d18aa4f59b993a837b433d5aec627c3ab70c8"} Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.441250 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.449737 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.464309 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-89w45" podStartSLOduration=29.464286944 podStartE2EDuration="29.464286944s" podCreationTimestamp="2026-02-17 14:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:09:09.461377803 +0000 UTC m=+230.041378475" watchObservedRunningTime="2026-02-17 14:09:09.464286944 +0000 UTC m=+230.044287596" Feb 17 14:09:09 
crc kubenswrapper[4762]: I0217 14:09:09.771219 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.771270 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.816897 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.969019 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:09:09 crc kubenswrapper[4762]: I0217 14:09:09.969095 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.009104 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.078899 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02adf3f5-bd74-409a-8942-f77cba830901" path="/var/lib/kubelet/pods/02adf3f5-bd74-409a-8942-f77cba830901/volumes" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.226769 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.265661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.381591 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.381658 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.417777 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.490401 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.495288 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:09:10 crc kubenswrapper[4762]: I0217 14:09:10.501599 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:09:11 crc kubenswrapper[4762]: I0217 14:09:11.983562 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.042779 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.353600 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.353693 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.407544 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.446991 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.447328 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-66rsm" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="registry-server" containerID="cri-o://75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1" gracePeriod=2 Feb 17 14:09:12 crc kubenswrapper[4762]: I0217 14:09:12.499044 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.008376 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-28cgn" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.008723 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-28cgn" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.320217 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.401856 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j27jc"] Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.402124 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j27jc" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="registry-server" containerID="cri-o://a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9" gracePeriod=2 Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.464596 4762 generic.go:334] "Generic (PLEG): container finished" podID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerID="75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1" exitCode=0 Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.464731 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66rsm" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.464714 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerDied","Data":"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1"} Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.464775 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66rsm" event={"ID":"8fcc9b44-0a23-4690-8620-ede69e43a7f4","Type":"ContainerDied","Data":"244f21222bf6a06cdf751507bfeb4bbf88c40e93bd7c7e7f71473ef2b7812688"} Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.464800 4762 scope.go:117] "RemoveContainer" containerID="75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.465051 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content\") pod \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.465218 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddm66\" (UniqueName: \"kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66\") pod \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.465369 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities\") pod \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\" (UID: \"8fcc9b44-0a23-4690-8620-ede69e43a7f4\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.466789 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities" (OuterVolumeSpecName: "utilities") pod "8fcc9b44-0a23-4690-8620-ede69e43a7f4" (UID: "8fcc9b44-0a23-4690-8620-ede69e43a7f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.468265 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hv4vz" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.470426 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66" (OuterVolumeSpecName: "kube-api-access-ddm66") pod "8fcc9b44-0a23-4690-8620-ede69e43a7f4" (UID: "8fcc9b44-0a23-4690-8620-ede69e43a7f4"). InnerVolumeSpecName "kube-api-access-ddm66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.515814 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fcc9b44-0a23-4690-8620-ede69e43a7f4" (UID: "8fcc9b44-0a23-4690-8620-ede69e43a7f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.517011 4762 scope.go:117] "RemoveContainer" containerID="171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.518356 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hv4vz" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.541792 4762 scope.go:117] "RemoveContainer" containerID="0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.562242 4762 scope.go:117] "RemoveContainer" containerID="75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1" Feb 17 14:09:13 crc kubenswrapper[4762]: E0217 14:09:13.562744 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1\": container with ID starting with 75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1 not found: ID does not exist" containerID="75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.562780 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1"} err="failed to get container status \"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1\": rpc error: code = NotFound desc = could not find container \"75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1\": container with ID starting with 75924d6776f9151d76926aa0eb6292c1e2d3dc9cd0c328f493c88e5cb3651ce1 not found: ID does not exist" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.562807 4762 scope.go:117] "RemoveContainer" containerID="171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462" Feb 17 14:09:13 crc kubenswrapper[4762]: E0217 14:09:13.563189 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462\": container with ID starting with 171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462 not found: ID does not exist" containerID="171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.563207 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462"} err="failed to get container status \"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462\": rpc error: code = NotFound desc = could not find container \"171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462\": container with ID starting with 171b3f995a0fce0fd9675f2eefa065fa2e36d8934130bc381642ee51513c3462 not found: ID does not exist" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.563219 4762 scope.go:117] "RemoveContainer" containerID="0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31" Feb 17 14:09:13 crc kubenswrapper[4762]: E0217 14:09:13.563418 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31\": container with ID starting with 
0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31 not found: ID does not exist" containerID="0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.563432 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31"} err="failed to get container status \"0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31\": rpc error: code = NotFound desc = could not find container \"0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31\": container with ID starting with 0570fd29ac909fc841ab6ff35f604b3814b234b8e859957bba39d6d0b73cde31 not found: ID does not exist" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.566412 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddm66\" (UniqueName: \"kubernetes.io/projected/8fcc9b44-0a23-4690-8620-ede69e43a7f4-kube-api-access-ddm66\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.566434 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.566443 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fcc9b44-0a23-4690-8620-ede69e43a7f4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.745576 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j27jc" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.793887 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.796609 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-66rsm"] Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.869729 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zlnm\" (UniqueName: \"kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm\") pod \"4505d245-d558-4112-893d-75b19c128b09\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.869887 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities\") pod \"4505d245-d558-4112-893d-75b19c128b09\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.869935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content\") pod \"4505d245-d558-4112-893d-75b19c128b09\" (UID: \"4505d245-d558-4112-893d-75b19c128b09\") " Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.872579 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities" (OuterVolumeSpecName: "utilities") pod "4505d245-d558-4112-893d-75b19c128b09" (UID: "4505d245-d558-4112-893d-75b19c128b09"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.873812 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm" (OuterVolumeSpecName: "kube-api-access-8zlnm") pod "4505d245-d558-4112-893d-75b19c128b09" (UID: "4505d245-d558-4112-893d-75b19c128b09"). InnerVolumeSpecName "kube-api-access-8zlnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.924700 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4505d245-d558-4112-893d-75b19c128b09" (UID: "4505d245-d558-4112-893d-75b19c128b09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.971496 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zlnm\" (UniqueName: \"kubernetes.io/projected/4505d245-d558-4112-893d-75b19c128b09-kube-api-access-8zlnm\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.971534 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:13 crc kubenswrapper[4762]: I0217 14:09:13.971547 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4505d245-d558-4112-893d-75b19c128b09-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.029860 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030146 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="extract-utilities" Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030161 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="extract-utilities" Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030177 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="registry-server" Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030185 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="registry-server" Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030196 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="extract-content" Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030204 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="extract-content" Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030226 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="registry-server" Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030234 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" 
containerName="registry-server"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030246 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="extract-content"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030253 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="extract-content"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.030263 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="extract-utilities"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030269 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="extract-utilities"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030380 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4505d245-d558-4112-893d-75b19c128b09" containerName="registry-server"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030394 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" containerName="registry-server"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.030851 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.031231 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.031582 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a" gracePeriod=15
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.031688 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1" gracePeriod=15
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.031744 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310" gracePeriod=15
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.031798 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee" gracePeriod=15
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.032299 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d" gracePeriod=15
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033067 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033379 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033409 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033427 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033438 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033454 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033465 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033486 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033496 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033511 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033522 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033539 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033550 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033768 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033786 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033800 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033812 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033826 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.033840 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.033998 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.034018 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.052832 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-28cgn" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:09:14 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:09:14 crc kubenswrapper[4762]: >
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.117237 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fcc9b44-0a23-4690-8620-ede69e43a7f4" path="/var/lib/kubelet/pods/8fcc9b44-0a23-4690-8620-ede69e43a7f4/volumes"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210570 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210723 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.210950 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.211033 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.211120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311772 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311853 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311901 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311965 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312005 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312096 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312010 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312168 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.311965 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.312214 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.472481 4762 generic.go:334] "Generic (PLEG): container finished" podID="4505d245-d558-4112-893d-75b19c128b09" containerID="a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9" exitCode=0
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.472541 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j27jc"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.472555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerDied","Data":"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"}
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.472583 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j27jc" event={"ID":"4505d245-d558-4112-893d-75b19c128b09","Type":"ContainerDied","Data":"17942061e8fb438a8e5ca86f8e63a1e22bd2d2eca4d345272307d11046eca8a8"}
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.472599 4762 scope.go:117] "RemoveContainer" containerID="a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.473274 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.475862 4762 generic.go:334] "Generic (PLEG): container finished" podID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" containerID="29722c15b564b6372b5038cc5ecbc8c02741c2426af9f180271672554b128e25" exitCode=0
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.475938 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3ac57045-b522-4701-8c80-c3fdf4aaeb14","Type":"ContainerDied","Data":"29722c15b564b6372b5038cc5ecbc8c02741c2426af9f180271672554b128e25"}
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.476491 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.476838 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.478129 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.479061 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.479282 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.479321 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.479993 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d" exitCode=0
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.480017 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee" exitCode=0
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.480026 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1" exitCode=0
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.480033 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310" exitCode=2
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.489216 4762 scope.go:117] "RemoveContainer" containerID="ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.502878 4762 scope.go:117] "RemoveContainer" containerID="b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.517481 4762 scope.go:117] "RemoveContainer" containerID="a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.517999 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9\": container with ID starting with a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9 not found: ID does not exist" containerID="a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.518030 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9"} err="failed to get container status \"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9\": rpc error: code = NotFound desc = could not find container \"a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9\": container with ID starting with a0184b4b61cf59151898a471e738843650bb7f71fe06d01a98fc95c86cbac1c9 not found: ID does not exist"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.518052 4762 scope.go:117] "RemoveContainer" containerID="ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.518449 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88\": container with ID starting with ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88 not found: ID does not exist" containerID="ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.518487 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88"} err="failed to get container status \"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88\": rpc error: code = NotFound desc = could not find container \"ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88\": container with ID starting with ad4f94137f165990a61ce51cc49c3d3a90a1459f7c768aa62e3fb251f633fc88 not found: ID does not exist"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.518513 4762 scope.go:117] "RemoveContainer" containerID="b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77"
Feb 17 14:09:14 crc kubenswrapper[4762]: E0217 14:09:14.519062 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77\": container with ID starting with b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77 not found: ID does not exist" containerID="b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.519090 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77"} err="failed to get container status \"b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77\": rpc error: code = NotFound desc = could not find container \"b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77\": container with ID starting with b88e20b3be9613f513e7421ff31ad02dc22e4b1c8234d741b7d60df579ea3d77 not found: ID does not exist"
Feb 17 14:09:14 crc kubenswrapper[4762]: I0217 14:09:14.519105 4762 scope.go:117] "RemoveContainer" containerID="104fbb6d136628d09d7aed026a4077947c83c231cb4c4b69e6d054f64c24114a"
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.489841 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.734056 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.734562 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.735052 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932130 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir\") pod \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") "
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932194 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access\") pod \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") "
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932212 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock\") pod \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\" (UID: \"3ac57045-b522-4701-8c80-c3fdf4aaeb14\") "
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3ac57045-b522-4701-8c80-c3fdf4aaeb14" (UID: "3ac57045-b522-4701-8c80-c3fdf4aaeb14"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932432 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock" (OuterVolumeSpecName: "var-lock") pod "3ac57045-b522-4701-8c80-c3fdf4aaeb14" (UID: "3ac57045-b522-4701-8c80-c3fdf4aaeb14"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932584 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.932597 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3ac57045-b522-4701-8c80-c3fdf4aaeb14-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:15 crc kubenswrapper[4762]: I0217 14:09:15.937035 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ac57045-b522-4701-8c80-c3fdf4aaeb14" (UID: "3ac57045-b522-4701-8c80-c3fdf4aaeb14"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.033578 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ac57045-b522-4701-8c80-c3fdf4aaeb14-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.397923 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.399014 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.399702 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.400177 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.400466 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.498436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3ac57045-b522-4701-8c80-c3fdf4aaeb14","Type":"ContainerDied","Data":"69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c"}
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.498486 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f67b3a2738447595c9393becb90c34919f6df20843e0db152bd0262aa5257c"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.498535 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.500778 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.501719 4762 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a" exitCode=0
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.501810 4762 scope.go:117] "RemoveContainer" containerID="39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.501830 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.504881 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.505093 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.505318 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.517901 4762 scope.go:117] "RemoveContainer" containerID="ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.529516 4762 scope.go:117] "RemoveContainer" containerID="c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539032 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539088 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539095 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539113 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539132 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539373 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539387 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.539397 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.540026 4762 scope.go:117] "RemoveContainer" containerID="11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.550250 4762 scope.go:117] "RemoveContainer" containerID="4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.561826 4762 scope.go:117] "RemoveContainer" containerID="fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.576918 4762 scope.go:117] "RemoveContainer" containerID="39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.577316 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\": container with ID starting with 39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d not found: ID does not exist" containerID="39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.577354 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d"} err="failed to get container status \"39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\": rpc error: code = NotFound desc = could not find container \"39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d\": container with ID starting with 39a64e56e1220c39017fe990710439d9d02242c2a43f755adeeab23d5b30ab1d not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.577376 4762 scope.go:117] "RemoveContainer" containerID="ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.577669 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\": container with ID starting with ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee not found: ID does not exist" containerID="ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.577702 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee"} err="failed to get container status \"ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\": rpc error: code = NotFound desc = could not find container \"ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee\": container with ID starting with ec6dade78436c7e01aadaa151821a8cf6efbfc65a041250d3d4c37236f1537ee not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.577720 4762 scope.go:117] "RemoveContainer" containerID="c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.577985 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\": container with ID starting with c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1 not found: ID does not exist" containerID="c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578020 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1"} err="failed to get container status \"c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\": rpc error: code = NotFound desc = could not find container \"c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1\": container with ID starting with c2f2b0e916ac714dd72f635e153367c69a1939e2c41df1dbbcc834fa36a5eda1 not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578041 4762 scope.go:117] "RemoveContainer" containerID="11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.578370 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\": container with ID starting with 11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310 not found: ID does not exist" containerID="11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578401 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310"} err="failed to get container status \"11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\": rpc error: code = NotFound desc = could not find container \"11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310\": container with ID starting with 11e44159bba68cd3a47e144cf4e5dfdc2a4ca9682722bf377ec150aae6707310 not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578420 4762 scope.go:117] "RemoveContainer" containerID="4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.578776 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\": container with ID starting with 4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a not found: ID does not exist" containerID="4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578801 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a"} err="failed to get container status \"4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\": rpc error: code = NotFound desc = could not find container \"4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a\": container with ID starting with 4ceb30ddfbb0fca3eef902cf6c91b22f0d85412e621a9d669d0e34033565192a not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.578822 4762 scope.go:117] "RemoveContainer" containerID="fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92"
Feb 17 14:09:16 crc kubenswrapper[4762]: E0217 14:09:16.579260 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\": container with ID starting with fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92 not found: ID does not exist" containerID="fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.579477 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92"} err="failed to get container status \"fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\": rpc error: code = NotFound desc = could not find container \"fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92\": container with ID starting with fcadfa0dc60f098b16314818a5295a60cad034f917dc5ce65cc287a2d669dc92 not found: ID does not exist"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.815934 4762 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.816283 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4762]: I0217 14:09:16.816519 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:18 crc kubenswrapper[4762]: I0217 14:09:18.082161 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 17 14:09:19 crc kubenswrapper[4762]: E0217 14:09:19.112686 4762 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.214:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:19 crc kubenswrapper[4762]: I0217 14:09:19.114260 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:19 crc kubenswrapper[4762]: W0217 14:09:19.139965 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-7c41975268551269ac4b9927cc5aae26b693478ab874f4c3a6632483e9bbc7c5 WatchSource:0}: Error finding container 7c41975268551269ac4b9927cc5aae26b693478ab874f4c3a6632483e9bbc7c5: Status 404 returned error can't find the container with id 7c41975268551269ac4b9927cc5aae26b693478ab874f4c3a6632483e9bbc7c5
Feb 17 14:09:19 crc kubenswrapper[4762]: E0217 14:09:19.142287 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.214:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950df1750900bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:09:19.141961919 +0000 UTC m=+239.721962571,LastTimestamp:2026-02-17 14:09:19.141961919 +0000 UTC m=+239.721962571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 14:09:19 crc kubenswrapper[4762]: I0217 14:09:19.520245 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c24e78f16a94a50bbd85e9819c8aafedc56f306a423ddd3601b22c21d0c280fc"}
Feb 17 14:09:19 crc kubenswrapper[4762]: I0217 14:09:19.520545 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7c41975268551269ac4b9927cc5aae26b693478ab874f4c3a6632483e9bbc7c5"}
Feb 17 14:09:19 crc kubenswrapper[4762]: E0217 14:09:19.520969 4762 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.214:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:09:19 crc kubenswrapper[4762]: I0217 14:09:19.521638 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:19 crc kubenswrapper[4762]: I0217 14:09:19.522158 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:20 crc kubenswrapper[4762]: I0217 14:09:20.079670 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:20 crc kubenswrapper[4762]: I0217 14:09:20.080507 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.957376 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.958151 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.958509 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.958977 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.962896 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:21 crc kubenswrapper[4762]: I0217 14:09:21.962975 4762 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 17 14:09:21 crc kubenswrapper[4762]: E0217 14:09:21.963466 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="200ms"
Feb 17 14:09:22 crc kubenswrapper[4762]: E0217 14:09:22.116927 4762 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.214:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" volumeName="registry-storage"
Feb 17 14:09:22 crc kubenswrapper[4762]: E0217 14:09:22.164990 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="400ms"
Feb 17 14:09:22 crc kubenswrapper[4762]: E0217 14:09:22.566297 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="800ms"
Feb 17 14:09:22 crc kubenswrapper[4762]: E0217 14:09:22.997690 4762 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.214:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950df1750900bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:09:19.141961919 +0000 UTC m=+239.721962571,LastTimestamp:2026-02-17 14:09:19.141961919 +0000 UTC m=+239.721962571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.047963 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.048612 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.049516 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.050335 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.082456 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-28cgn"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.083018 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.083599 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: I0217 14:09:23.083907 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4762]: E0217 14:09:23.368255 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="1.6s"
Feb 17 14:09:24 crc kubenswrapper[4762]: E0217 14:09:24.969353 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="3.2s"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.564958 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.565342 4762 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2" exitCode=1
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.565383 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2"}
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.565980 4762 scope.go:117] "RemoveContainer" containerID="92840c898194e870b17180920df9a613a9db0e262c7b53013635295c0db9d3d2"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.566710 4762 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.567272 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.567620 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:27 crc kubenswrapper[4762]: I0217 14:09:27.568187 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.070997 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.071879 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.072352 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.072778 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.073276 4762 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.085542 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.085573 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3"
Feb 17 14:09:28 crc kubenswrapper[4762]: E0217 14:09:28.085995 4762 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.214:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.086412 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:09:28 crc kubenswrapper[4762]: W0217 14:09:28.105278 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2e043f9eb735609fa28955cc56bbda16f949d5af595a58df4b38afb62b6edfa5 WatchSource:0}: Error finding container 2e043f9eb735609fa28955cc56bbda16f949d5af595a58df4b38afb62b6edfa5: Status 404 returned error can't find the container with id 2e043f9eb735609fa28955cc56bbda16f949d5af595a58df4b38afb62b6edfa5
Feb 17 14:09:28 crc kubenswrapper[4762]: E0217 14:09:28.170286 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.214:6443: connect: connection refused" interval="6.4s"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.581916 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.582003 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"df9d57ec1b8121fc6a66157b54d52d76e81b1dd080c3a877d025eb084ad546f8"}
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.582765 4762 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.582974 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.583148 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.583351 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused"
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584366 4762 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="acb46ca5dc590dd91d074feb12a4851242636901d83e022f77642898ecb047fb" exitCode=0
Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584413 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc"
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"acb46ca5dc590dd91d074feb12a4851242636901d83e022f77642898ecb047fb"} Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e043f9eb735609fa28955cc56bbda16f949d5af595a58df4b38afb62b6edfa5"} Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584748 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584765 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.584874 4762 status_manager.go:851] "Failed to get status for pod" podUID="4505d245-d558-4112-893d-75b19c128b09" pod="openshift-marketplace/community-operators-j27jc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-j27jc\": dial tcp 38.102.83.214:6443: connect: connection refused" Feb 17 14:09:28 crc kubenswrapper[4762]: E0217 14:09:28.585142 4762 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.214:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.585256 4762 status_manager.go:851] "Failed to get status for pod" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" pod="openshift-marketplace/redhat-operators-28cgn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-28cgn\": dial tcp 38.102.83.214:6443: connect: connection refused" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.586006 4762 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.214:6443: connect: connection refused" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.586461 4762 status_manager.go:851] "Failed to get status for pod" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.214:6443: connect: connection refused" Feb 17 14:09:28 crc kubenswrapper[4762]: I0217 14:09:28.741015 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:09:29 crc kubenswrapper[4762]: I0217 14:09:29.750716 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"36ca6fef9519b1e5514ec69f9d07c5d21240cd46395fe6015b294fddeba215ac"} Feb 17 14:09:29 crc kubenswrapper[4762]: I0217 14:09:29.750752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"248a7500ec205949f31adac78754a75357730b35dbd5bdfe4d46eacbdce4ab42"} Feb 17 14:09:29 crc kubenswrapper[4762]: I0217 14:09:29.750761 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"52e97c3530203597472f3c10c2ba36d871bc50d5b01f89e31d939e0eac4873ce"} Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.325029 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.325739 4762 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.325881 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.786542 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.786572 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.786968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cc4ee41190e16ee0464903d73a247b46f1c94c987edcc0247bb83c185325e7ed"} Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.786995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"546953efe6c0390876597e7cf6d3e2a0112ef43070f42d14cdd6ed0da5ecfb60"} Feb 17 14:09:30 crc kubenswrapper[4762]: I0217 14:09:30.787028 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:33 crc kubenswrapper[4762]: I0217 14:09:33.086715 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:33 crc kubenswrapper[4762]: I0217 14:09:33.087812 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:33 crc kubenswrapper[4762]: I0217 14:09:33.091727 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:35 crc kubenswrapper[4762]: I0217 14:09:35.847572 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:35 crc kubenswrapper[4762]: I0217 14:09:35.904669 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status 
update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b4d49f07-d131-40e4-abd1-ee4c505f7a6e" Feb 17 14:09:36 crc kubenswrapper[4762]: I0217 14:09:36.817935 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:36 crc kubenswrapper[4762]: I0217 14:09:36.818279 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:36 crc kubenswrapper[4762]: I0217 14:09:36.820631 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b4d49f07-d131-40e4-abd1-ee4c505f7a6e" Feb 17 14:09:36 crc kubenswrapper[4762]: I0217 14:09:36.822122 4762 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://52e97c3530203597472f3c10c2ba36d871bc50d5b01f89e31d939e0eac4873ce" Feb 17 14:09:36 crc kubenswrapper[4762]: I0217 14:09:36.822151 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:09:37 crc kubenswrapper[4762]: I0217 14:09:37.822264 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:37 crc kubenswrapper[4762]: I0217 14:09:37.822579 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8507903e-806f-4e57-bb1e-d218465a9ea3" Feb 17 14:09:37 crc kubenswrapper[4762]: I0217 14:09:37.825308 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b4d49f07-d131-40e4-abd1-ee4c505f7a6e" Feb 17 14:09:40 crc kubenswrapper[4762]: I0217 14:09:40.378040 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:09:40 crc kubenswrapper[4762]: I0217 14:09:40.381487 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:09:45 crc kubenswrapper[4762]: I0217 14:09:45.485064 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 14:09:45 crc kubenswrapper[4762]: I0217 14:09:45.662794 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 14:09:45 crc kubenswrapper[4762]: I0217 14:09:45.830618 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 14:09:46 crc kubenswrapper[4762]: I0217 14:09:46.597198 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:09:46 crc kubenswrapper[4762]: I0217 14:09:46.902889 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 14:09:47 crc kubenswrapper[4762]: I0217 14:09:47.027660 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 14:09:47 crc kubenswrapper[4762]: I0217 
14:09:47.629345 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 14:09:47 crc kubenswrapper[4762]: I0217 14:09:47.657354 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.005381 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.023321 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.028932 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.358485 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.467148 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.498349 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.532408 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.844323 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.868697 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.886901 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 14:09:48 crc kubenswrapper[4762]: I0217 14:09:48.954428 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.115102 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.125812 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.154825 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.164995 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.235980 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.296623 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.344080 4762 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.346531 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.366461 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.460333 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.461935 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.465741 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.524093 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.529581 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.552036 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.582306 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.620297 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.622248 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.680430 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.700037 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.707141 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.767520 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 14:09:49 crc kubenswrapper[4762]: I0217 14:09:49.838941 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.033264 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.086747 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.107265 4762 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.110401 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.116813 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.185171 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.229724 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.264414 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.268659 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.305850 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.367688 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.431576 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.468289 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.514713 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.663626 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.713941 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.765072 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.910675 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.922800 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:09:50 crc kubenswrapper[4762]: I0217 14:09:50.965987 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.131233 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.154334 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.208989 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.293711 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.398185 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.460801 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.465513 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.477851 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.528979 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.555628 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.604843 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.636683 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.645058 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.675357 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 14:09:51 crc kubenswrapper[4762]: I0217 14:09:51.869313 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.078062 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.128149 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.131292 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.131450 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.159245 4762 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.197997 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.314593 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.395103 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.395417 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.435677 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.560121 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.635215 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.712826 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.714418 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.731015 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.821741 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.835260 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.891987 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.928581 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.943311 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 14:09:52 crc kubenswrapper[4762]: I0217 14:09:52.997150 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.033659 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.058877 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.123270 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 
14:09:53.133101 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.142314 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.202372 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.211464 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.270403 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.320356 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.355423 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.365321 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.465538 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.500233 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.611819 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.706084 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.863060 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.891111 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.930479 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 14:09:53 crc kubenswrapper[4762]: I0217 14:09:53.963958 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.151450 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.231473 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.263071 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 
14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.272147 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.292636 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.520897 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.558609 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.623355 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.654598 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.741059 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.764931 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.783111 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.818201 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.820097 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.885800 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.910870 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.942364 4762 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 14:09:54 crc kubenswrapper[4762]: I0217 14:09:54.967298 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.051638 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.150190 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.190413 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.209207 4762 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.262969 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.338283 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.446871 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.457226 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.649710 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.670032 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.699127 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.700318 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.712970 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.865434 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 14:09:55 crc kubenswrapper[4762]: I0217 14:09:55.964991 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.043777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.064460 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.064460 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.140081 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.261335 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.372733 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.406470 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 14:09:56 crc 
kubenswrapper[4762]: I0217 14:09:56.444594 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.483276 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.543026 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.570628 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.636884 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.684484 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.701230 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.735868 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.770092 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.784503 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.818787 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.822131 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.832714 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.876602 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.901522 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:09:56 crc kubenswrapper[4762]: I0217 14:09:56.921691 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.087332 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.088788 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.091248 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.140577 4762 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.203285 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.306308 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.352280 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.463164 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.489895 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.547489 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.553764 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.643981 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.741359 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.815704 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.817447 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.842050 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.879450 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.900462 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.935171 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 14:09:57 crc kubenswrapper[4762]: I0217 14:09:57.979976 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.013408 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.137956 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:09:58 
crc kubenswrapper[4762]: I0217 14:09:58.191147 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.290471 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.338749 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.362429 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.380964 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.418458 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.573244 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.584256 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.696126 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.730702 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.732805 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.748070 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.827051 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.866880 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.941395 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 14:09:58 crc kubenswrapper[4762]: I0217 14:09:58.998278 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.020706 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.033230 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.109629 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.131397 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.437512 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.445340 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.466537 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.506534 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.625398 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.644453 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.758137 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.759323 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.762114 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.835121 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.857016 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 14:09:59 crc kubenswrapper[4762]: I0217 14:09:59.953544 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.005125 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.006061 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.093527 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.233485 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.483093 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.518340 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.544637 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.596420 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.731486 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.746531 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 14:10:00 crc kubenswrapper[4762]: I0217 14:10:00.989484 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.013982 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.030940 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.432613 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.445958 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.510210 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.659472 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:10:01 crc kubenswrapper[4762]: I0217 14:10:01.717960 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.141193 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.145418 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.161458 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.252340 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.537861 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.576296 4762 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.581081 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j27jc","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.581156 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.586487 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.601978 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.601961664 podStartE2EDuration="27.601961664s" podCreationTimestamp="2026-02-17 14:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:10:02.598870259 +0000 UTC m=+283.178870911" watchObservedRunningTime="2026-02-17 14:10:02.601961664 +0000 UTC m=+283.181962316"
Feb 17 14:10:02 crc kubenswrapper[4762]: I0217 14:10:02.697288 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 14:10:04 crc kubenswrapper[4762]: I0217 14:10:04.078942 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4505d245-d558-4112-893d-75b19c128b09" path="/var/lib/kubelet/pods/4505d245-d558-4112-893d-75b19c128b09/volumes"
Feb 17 14:10:08 crc kubenswrapper[4762]: I0217 14:10:08.632211 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 14:10:08 crc kubenswrapper[4762]: I0217 14:10:08.632827 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c24e78f16a94a50bbd85e9819c8aafedc56f306a423ddd3601b22c21d0c280fc" gracePeriod=5
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.061904 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.062503 4762 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c24e78f16a94a50bbd85e9819c8aafedc56f306a423ddd3601b22c21d0c280fc" exitCode=137
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.199315 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.199390 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331287 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331383 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331453 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331452 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331482 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331511 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331559 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331662 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331689 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331873 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331898 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331910 4762 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.331922 4762 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.342840 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:10:14 crc kubenswrapper[4762]: I0217 14:10:14.432721 4762 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:15 crc kubenswrapper[4762]: I0217 14:10:15.070166 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 17 14:10:15 crc kubenswrapper[4762]: I0217 14:10:15.070465 4762 scope.go:117] "RemoveContainer" containerID="c24e78f16a94a50bbd85e9819c8aafedc56f306a423ddd3601b22c21d0c280fc"
Feb 17 14:10:15 crc kubenswrapper[4762]: I0217 14:10:15.070488 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 14:10:16 crc kubenswrapper[4762]: I0217 14:10:16.078381 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 17 14:10:19 crc kubenswrapper[4762]: I0217 14:10:19.099839 4762 generic.go:334] "Generic (PLEG): container finished" podID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerID="7a6ea7dcc9688017aa6d85d9918ae68333a411dddb372839ae3e4d61cf15c960" exitCode=0
Feb 17 14:10:19 crc kubenswrapper[4762]: I0217 14:10:19.099899 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerDied","Data":"7a6ea7dcc9688017aa6d85d9918ae68333a411dddb372839ae3e4d61cf15c960"}
Feb 17 14:10:19 crc kubenswrapper[4762]: I0217 14:10:19.100421 4762 scope.go:117] "RemoveContainer" containerID="7a6ea7dcc9688017aa6d85d9918ae68333a411dddb372839ae3e4d61cf15c960"
Feb 17 14:10:19 crc kubenswrapper[4762]: I0217 14:10:19.887714 4762 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 17 14:10:20 crc kubenswrapper[4762]: I0217 14:10:20.106980 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerStarted","Data":"fbb7165e310ac8915278a1ab594016ad0bdda7c965fa741a3de68c7a1fa07588"}
Feb 17 14:10:20 crc kubenswrapper[4762]: I0217 14:10:20.107668 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7"
Feb 17 14:10:20 crc kubenswrapper[4762]: I0217 14:10:20.116634 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7"
Feb 17 14:10:29 crc kubenswrapper[4762]: I0217 14:10:29.895611 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"]
Feb 17 14:10:29 crc kubenswrapper[4762]: I0217 14:10:29.896416 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerName="controller-manager" containerID="cri-o://8fde1cc2cbe99f8191e2b326908699fbb48ef74fea2039f786b1dc33059b8407" gracePeriod=30
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.005115 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"]
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.005307 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerName="route-controller-manager" containerID="cri-o://d0be7f9a275847575913aafbe2fd9d7e9bfed6f9d3f92e11d83afdf2556453c3" gracePeriod=30
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.174278 4762 generic.go:334] "Generic (PLEG): container finished" podID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerID="8fde1cc2cbe99f8191e2b326908699fbb48ef74fea2039f786b1dc33059b8407" exitCode=0
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.174603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" event={"ID":"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40","Type":"ContainerDied","Data":"8fde1cc2cbe99f8191e2b326908699fbb48ef74fea2039f786b1dc33059b8407"}
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.176195 4762 generic.go:334] "Generic (PLEG): container finished" podID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerID="d0be7f9a275847575913aafbe2fd9d7e9bfed6f9d3f92e11d83afdf2556453c3" exitCode=0
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.176228 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" event={"ID":"a57a8269-657e-49f2-8edb-189e9f69f1b4","Type":"ContainerDied","Data":"d0be7f9a275847575913aafbe2fd9d7e9bfed6f9d3f92e11d83afdf2556453c3"}
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.246550 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.343525 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.433593 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles\") pod \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.433743 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config\") pod \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.433800 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca\") pod \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.434066 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48v72\" (UniqueName: \"kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72\") pod \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.434132 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert\") pod \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\" (UID: \"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.434408 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" (UID: "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.434468 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config" (OuterVolumeSpecName: "config") pod "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" (UID: "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.435286 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" (UID: "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.439736 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" (UID: "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.439764 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72" (OuterVolumeSpecName: "kube-api-access-48v72") pod "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" (UID: "d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40"). InnerVolumeSpecName "kube-api-access-48v72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535091 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca\") pod \"a57a8269-657e-49f2-8edb-189e9f69f1b4\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535152 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert\") pod \"a57a8269-657e-49f2-8edb-189e9f69f1b4\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535213 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config\") pod \"a57a8269-657e-49f2-8edb-189e9f69f1b4\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535243 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjr97\" (UniqueName: \"kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97\") pod \"a57a8269-657e-49f2-8edb-189e9f69f1b4\" (UID: \"a57a8269-657e-49f2-8edb-189e9f69f1b4\") "
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535445 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48v72\" (UniqueName: \"kubernetes.io/projected/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-kube-api-access-48v72\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535456 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535464 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535473 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.535482 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.536296 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "a57a8269-657e-49f2-8edb-189e9f69f1b4" (UID: "a57a8269-657e-49f2-8edb-189e9f69f1b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.536302 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config" (OuterVolumeSpecName: "config") pod "a57a8269-657e-49f2-8edb-189e9f69f1b4" (UID: "a57a8269-657e-49f2-8edb-189e9f69f1b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.538680 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97" (OuterVolumeSpecName: "kube-api-access-qjr97") pod "a57a8269-657e-49f2-8edb-189e9f69f1b4" (UID: "a57a8269-657e-49f2-8edb-189e9f69f1b4"). InnerVolumeSpecName "kube-api-access-qjr97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.538800 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a57a8269-657e-49f2-8edb-189e9f69f1b4" (UID: "a57a8269-657e-49f2-8edb-189e9f69f1b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.636190 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57a8269-657e-49f2-8edb-189e9f69f1b4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.636245 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.636257 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjr97\" (UniqueName: \"kubernetes.io/projected/a57a8269-657e-49f2-8edb-189e9f69f1b4-kube-api-access-qjr97\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:30 crc kubenswrapper[4762]: I0217 14:10:30.636267 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a57a8269-657e-49f2-8edb-189e9f69f1b4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.184624 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.184621 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd" event={"ID":"a57a8269-657e-49f2-8edb-189e9f69f1b4","Type":"ContainerDied","Data":"25576de0dbc476e17785eb2deb3ed267114711ee7feca36b6ab70372d4a42c6f"}
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.185093 4762 scope.go:117] "RemoveContainer" containerID="d0be7f9a275847575913aafbe2fd9d7e9bfed6f9d3f92e11d83afdf2556453c3"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.189279 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.189187 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-58fnv" event={"ID":"d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40","Type":"ContainerDied","Data":"816a1f341fc58bc9adfc9fdb1598493e84f557a65a05e024b91f1c3b7c746a1d"}
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.203923 4762 scope.go:117] "RemoveContainer" containerID="8fde1cc2cbe99f8191e2b326908699fbb48ef74fea2039f786b1dc33059b8407"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.215393 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.219189 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8gksd"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.222749 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.225881 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-58fnv"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562216 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"]
Feb 17 14:10:31 crc kubenswrapper[4762]: E0217 14:10:31.562575 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerName="route-controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562591 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerName="route-controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: E0217 14:10:31.562619 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562628 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 14:10:31 crc kubenswrapper[4762]: E0217 14:10:31.562660 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerName="controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562672 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerName="controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: E0217 14:10:31.562682 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" containerName="installer"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562690 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" containerName="installer"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562836 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562857 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" containerName="controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562867 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac57045-b522-4701-8c80-c3fdf4aaeb14" containerName="installer"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.562884 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" containerName="route-controller-manager"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.563374 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.565548 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.566344 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.566553 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.566846 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.566916 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.567037 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.567352 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.570067 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.570071 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.570954 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.571136 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.571180 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.571311 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.572139 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.577359 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.578774 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.584728 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"]
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.749760 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.749831 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.749882 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.749921 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.749956 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsb2z\" (UniqueName: \"kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.750008 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.750048 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.750082 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwfc\" (UniqueName: \"kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.750320 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.851924 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.851991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852021 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852070 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsb2z\" (UniqueName: \"kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852112 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852149 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852177 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwfc\" (UniqueName: \"kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.852238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.853281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.853584 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.854548 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.854988 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.855174 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.857993 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.859054 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.876261 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsb2z\" (UniqueName: \"kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z\") pod \"route-controller-manager-86c679cff5-bq89d\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.876839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwfc\" (UniqueName: \"kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc\") pod \"controller-manager-7ff5bf444c-7w8m5\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.888013 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:31 crc kubenswrapper[4762]: I0217 14:10:31.894223 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:32 crc kubenswrapper[4762]: I0217 14:10:32.081272 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57a8269-657e-49f2-8edb-189e9f69f1b4" path="/var/lib/kubelet/pods/a57a8269-657e-49f2-8edb-189e9f69f1b4/volumes"
Feb 17 14:10:32 crc kubenswrapper[4762]: I0217 14:10:32.082926 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40" path="/var/lib/kubelet/pods/d1ac2af6-e83a-45b3-b0f3-dbbfe7874c40/volumes"
Feb 17 14:10:32 crc kubenswrapper[4762]: I0217 14:10:32.121326 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"]
Feb 17 14:10:32 crc kubenswrapper[4762]: I0217 14:10:32.186338 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"]
Feb 17 14:10:32 crc kubenswrapper[4762]: W0217 14:10:32.196470 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fb52ff_a8cf_443a_81b4_c998fbdd5789.slice/crio-c5f03a651a94e7ac0fd329a89afe9b47cd922dbedf63090a0100d1472f06c4fd WatchSource:0}: Error finding container c5f03a651a94e7ac0fd329a89afe9b47cd922dbedf63090a0100d1472f06c4fd: Status 404 returned error can't find the container with id c5f03a651a94e7ac0fd329a89afe9b47cd922dbedf63090a0100d1472f06c4fd
Feb 17 14:10:32 crc kubenswrapper[4762]: I0217 14:10:32.202325 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" event={"ID":"68936b54-d9ef-46f6-8781-f6793c92ad62","Type":"ContainerStarted","Data":"d38421becd96c0ea60fe6ea956a1b2ead566006da282903a3b08e1debc3f9a7e"}
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.209196 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" event={"ID":"68936b54-d9ef-46f6-8781-f6793c92ad62","Type":"ContainerStarted","Data":"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5"}
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.209558 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.213429 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" event={"ID":"25fb52ff-a8cf-443a-81b4-c998fbdd5789","Type":"ContainerStarted","Data":"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875"}
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.213472 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" event={"ID":"25fb52ff-a8cf-443a-81b4-c998fbdd5789","Type":"ContainerStarted","Data":"c5f03a651a94e7ac0fd329a89afe9b47cd922dbedf63090a0100d1472f06c4fd"}
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.213691 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.215720 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.220107 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.224008 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" podStartSLOduration=3.22399157 podStartE2EDuration="3.22399157s" podCreationTimestamp="2026-02-17 14:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:10:33.223741553 +0000 UTC m=+313.803742215" watchObservedRunningTime="2026-02-17 14:10:33.22399157 +0000 UTC m=+313.803992222"
Feb 17 14:10:33 crc kubenswrapper[4762]: I0217 14:10:33.254604 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" podStartSLOduration=3.254586189 podStartE2EDuration="3.254586189s" podCreationTimestamp="2026-02-17 14:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:10:33.253221631 +0000 UTC m=+313.833222293" watchObservedRunningTime="2026-02-17 14:10:33.254586189 +0000 UTC m=+313.834586831"
Feb 17 14:10:51 crc kubenswrapper[4762]: I0217 14:10:51.716610 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:10:51 crc kubenswrapper[4762]: I0217 14:10:51.717501 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hv4vz" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="registry-server" containerID="cri-o://e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3" gracePeriod=2
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.109580 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"]
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.110055 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q7zdn" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="registry-server" containerID="cri-o://74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" gracePeriod=2
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.131709 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.302379 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content\") pod \"2f1332eb-9672-4d20-b2e4-4d26287d6464\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.302764 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities\") pod \"2f1332eb-9672-4d20-b2e4-4d26287d6464\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.302826 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt4jg\" (UniqueName: \"kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg\") pod \"2f1332eb-9672-4d20-b2e4-4d26287d6464\" (UID: \"2f1332eb-9672-4d20-b2e4-4d26287d6464\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.307379 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities" (OuterVolumeSpecName: "utilities") pod "2f1332eb-9672-4d20-b2e4-4d26287d6464" (UID: "2f1332eb-9672-4d20-b2e4-4d26287d6464"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.309130 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg" (OuterVolumeSpecName: "kube-api-access-tt4jg") pod "2f1332eb-9672-4d20-b2e4-4d26287d6464" (UID: "2f1332eb-9672-4d20-b2e4-4d26287d6464"). InnerVolumeSpecName "kube-api-access-tt4jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.323324 4762 generic.go:334] "Generic (PLEG): container finished" podID="a1770df5-1061-4617-91ae-3909f5fe514f" containerID="74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" exitCode=0
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.323382 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerDied","Data":"74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7"}
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.325561 4762 generic.go:334] "Generic (PLEG): container finished" podID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerID="e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3" exitCode=0
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.325584 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerDied","Data":"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"}
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.325599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv4vz" event={"ID":"2f1332eb-9672-4d20-b2e4-4d26287d6464","Type":"ContainerDied","Data":"8136ff1e3a40df4a9508f1c5626cd8fd8c81c3c67cc8c996271b31f948307289"}
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.325618 4762 scope.go:117] "RemoveContainer" containerID="e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.325665 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv4vz"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.345130 4762 scope.go:117] "RemoveContainer" containerID="62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.356363 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7 is running failed: container process not found" containerID="74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.356861 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7 is running failed: container process not found" containerID="74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.358722 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7 is running failed: container process not found" containerID="74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.358787 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-q7zdn" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="registry-server"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.380898 4762 scope.go:117] "RemoveContainer" containerID="4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.405283 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.405316 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt4jg\" (UniqueName: \"kubernetes.io/projected/2f1332eb-9672-4d20-b2e4-4d26287d6464-kube-api-access-tt4jg\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.406829 4762 scope.go:117] "RemoveContainer" containerID="e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.407307 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3\": container with ID starting with e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3 not found: ID does not exist" containerID="e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.407372 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3"} err="failed to get container status \"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3\": rpc error: code = NotFound desc = could not find container \"e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3\": container with ID starting with e2a89f35a6928ac026f1335d8a1459ebffb7a6c188ca46e6e0199400a81a83c3 not found: ID does not exist"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.407400 4762 scope.go:117] "RemoveContainer" containerID="62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.407732 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c\": container with ID starting with 62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c not found: ID does not exist" containerID="62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.407763 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c"} err="failed to get container status \"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c\": rpc error: code = NotFound desc = could not find container \"62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c\": container with ID starting with 62289f1944fbfbecac15b8da8ab806407604814540f0b14349d290945ab7fe7c not found: ID does not exist"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.407791 4762 scope.go:117] "RemoveContainer" containerID="4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03"
Feb 17 14:10:52 crc kubenswrapper[4762]: E0217 14:10:52.407993 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03\": container with ID starting with 4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03 not found: ID does not exist" containerID="4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.408014 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03"} err="failed to get container status \"4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03\": rpc error: code = NotFound desc = could not find container \"4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03\": container with ID starting with 4bf8f3aea76e41517727284beefa955730e3ab70ca3f3479e525db4f25496b03 not found: ID does not exist"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.455237 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f1332eb-9672-4d20-b2e4-4d26287d6464" (UID: "2f1332eb-9672-4d20-b2e4-4d26287d6464"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.506017 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f1332eb-9672-4d20-b2e4-4d26287d6464-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.523672 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7zdn"
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.650809 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.654242 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hv4vz"]
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.707982 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsn6p\" (UniqueName: \"kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p\") pod \"a1770df5-1061-4617-91ae-3909f5fe514f\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.708106 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content\") pod \"a1770df5-1061-4617-91ae-3909f5fe514f\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.708152 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities\") pod \"a1770df5-1061-4617-91ae-3909f5fe514f\" (UID: \"a1770df5-1061-4617-91ae-3909f5fe514f\") "
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.709016 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities" (OuterVolumeSpecName: "utilities") pod "a1770df5-1061-4617-91ae-3909f5fe514f" (UID: "a1770df5-1061-4617-91ae-3909f5fe514f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.711349 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p" (OuterVolumeSpecName: "kube-api-access-jsn6p") pod "a1770df5-1061-4617-91ae-3909f5fe514f" (UID: "a1770df5-1061-4617-91ae-3909f5fe514f"). InnerVolumeSpecName "kube-api-access-jsn6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.729475 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1770df5-1061-4617-91ae-3909f5fe514f" (UID: "a1770df5-1061-4617-91ae-3909f5fe514f"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.809919 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.809961 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1770df5-1061-4617-91ae-3909f5fe514f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:52 crc kubenswrapper[4762]: I0217 14:10:52.809977 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsn6p\" (UniqueName: \"kubernetes.io/projected/a1770df5-1061-4617-91ae-3909f5fe514f-kube-api-access-jsn6p\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.335302 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7zdn" event={"ID":"a1770df5-1061-4617-91ae-3909f5fe514f","Type":"ContainerDied","Data":"3de3f28d7b4934a0b540b1578eed346837435b4b6940f8b9ef45d3b97142cd7d"} Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.335354 4762 scope.go:117] "RemoveContainer" containerID="74494455b8004875e23e111458c477013d4aca37c563957eff1ca0bac9df3de7" Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.335404 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7zdn" Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.359852 4762 scope.go:117] "RemoveContainer" containerID="2490c7b9ab2f1f553722df509e44c8d2bb12bbe29fa6b51a4b64addb84ea43fd" Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.379128 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"] Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.383225 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7zdn"] Feb 17 14:10:53 crc kubenswrapper[4762]: I0217 14:10:53.394350 4762 scope.go:117] "RemoveContainer" containerID="9c84f9c706f800efebe3783429ec9d551d4a7e4cf2786d005b3382c519c861bb" Feb 17 14:10:54 crc kubenswrapper[4762]: I0217 14:10:54.082263 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" path="/var/lib/kubelet/pods/2f1332eb-9672-4d20-b2e4-4d26287d6464/volumes" Feb 17 14:10:54 crc kubenswrapper[4762]: I0217 14:10:54.083890 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" path="/var/lib/kubelet/pods/a1770df5-1061-4617-91ae-3909f5fe514f/volumes" Feb 17 14:11:24 crc kubenswrapper[4762]: I0217 14:11:24.622178 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:11:24 crc kubenswrapper[4762]: I0217 14:11:24.622802 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:11:25 crc kubenswrapper[4762]: 
I0217 14:11:25.597501 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hvzzr"] Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597722 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="extract-content" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597735 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="extract-content" Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597747 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597752 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597761 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="extract-utilities" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597767 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="extract-utilities" Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597774 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="extract-content" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597781 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="extract-content" Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597793 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="extract-utilities" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597798 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="extract-utilities" Feb 17 14:11:25 crc kubenswrapper[4762]: E0217 14:11:25.597811 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597817 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597903 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1770df5-1061-4617-91ae-3909f5fe514f" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.597919 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1332eb-9672-4d20-b2e4-4d26287d6464" containerName="registry-server" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.598303 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.618308 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hvzzr"] Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741184 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-bound-sa-token\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741243 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-trusted-ca\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741310 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pg87\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-kube-api-access-5pg87\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741533 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-tls\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741560 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.741582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-certificates\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.760852 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842674 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-tls\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-certificates\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-bound-sa-token\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842865 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-trusted-ca\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.842893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pg87\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-kube-api-access-5pg87\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.843577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.844491 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-certificates\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.845036 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-trusted-ca\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.848897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.849051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-registry-tls\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.859972 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-bound-sa-token\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.861035 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pg87\" (UniqueName: \"kubernetes.io/projected/076e87d2-3317-4b8a-8cfa-cdea8b2dc01c-kube-api-access-5pg87\") pod \"image-registry-66df7c8f76-hvzzr\" (UID: \"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c\") " pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:25 crc kubenswrapper[4762]: I0217 14:11:25.916406 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:26 crc kubenswrapper[4762]: I0217 14:11:26.346025 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hvzzr"] Feb 17 14:11:26 crc kubenswrapper[4762]: I0217 14:11:26.512840 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" event={"ID":"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c","Type":"ContainerStarted","Data":"cd3ec170f18eea17ca5d5f97e48bb03379882643c89ec25f81b525b404d3af5c"} Feb 17 14:11:26 crc kubenswrapper[4762]: I0217 14:11:26.512882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" event={"ID":"076e87d2-3317-4b8a-8cfa-cdea8b2dc01c","Type":"ContainerStarted","Data":"9cea088f9f96c3b8d72fc44bda219eb7af84861552fb7abd38e0da3fd8add952"} Feb 17 14:11:26 crc kubenswrapper[4762]: I0217 14:11:26.512985 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:26 crc kubenswrapper[4762]: I0217 14:11:26.531802 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" podStartSLOduration=1.531783774 podStartE2EDuration="1.531783774s" podCreationTimestamp="2026-02-17 14:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:26.531780804 +0000 UTC m=+367.111781466" watchObservedRunningTime="2026-02-17 14:11:26.531783774 +0000 UTC m=+367.111784426" Feb 17 14:11:29 crc kubenswrapper[4762]: I0217 14:11:29.865485 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"] Feb 17 14:11:29 crc kubenswrapper[4762]: I0217 14:11:29.866125 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" podUID="68936b54-d9ef-46f6-8781-f6793c92ad62" containerName="controller-manager" containerID="cri-o://9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5" gracePeriod=30 Feb 17 14:11:29 crc kubenswrapper[4762]: I0217 14:11:29.884422 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"] Feb 17 14:11:29 crc kubenswrapper[4762]: I0217 14:11:29.884684 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" podUID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" containerName="route-controller-manager" containerID="cri-o://0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875" gracePeriod=30 Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.266871 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.272516 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403063 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcwfc\" (UniqueName: \"kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc\") pod \"68936b54-d9ef-46f6-8781-f6793c92ad62\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsb2z\" (UniqueName: \"kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z\") pod \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403148 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca\") pod \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403177 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca\") pod \"68936b54-d9ef-46f6-8781-f6793c92ad62\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403196 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert\") pod \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403222 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert\") pod \"68936b54-d9ef-46f6-8781-f6793c92ad62\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403288 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles\") pod \"68936b54-d9ef-46f6-8781-f6793c92ad62\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403351 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config\") pod \"68936b54-d9ef-46f6-8781-f6793c92ad62\" (UID: \"68936b54-d9ef-46f6-8781-f6793c92ad62\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.403381 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config\") pod \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\" (UID: \"25fb52ff-a8cf-443a-81b4-c998fbdd5789\") " Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.404118 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca" (OuterVolumeSpecName: "client-ca") pod "25fb52ff-a8cf-443a-81b4-c998fbdd5789" 
(UID: "25fb52ff-a8cf-443a-81b4-c998fbdd5789"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.404377 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "68936b54-d9ef-46f6-8781-f6793c92ad62" (UID: "68936b54-d9ef-46f6-8781-f6793c92ad62"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.404395 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca" (OuterVolumeSpecName: "client-ca") pod "68936b54-d9ef-46f6-8781-f6793c92ad62" (UID: "68936b54-d9ef-46f6-8781-f6793c92ad62"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.404553 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config" (OuterVolumeSpecName: "config") pod "68936b54-d9ef-46f6-8781-f6793c92ad62" (UID: "68936b54-d9ef-46f6-8781-f6793c92ad62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.404637 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config" (OuterVolumeSpecName: "config") pod "25fb52ff-a8cf-443a-81b4-c998fbdd5789" (UID: "25fb52ff-a8cf-443a-81b4-c998fbdd5789"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.408816 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68936b54-d9ef-46f6-8781-f6793c92ad62" (UID: "68936b54-d9ef-46f6-8781-f6793c92ad62"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.408882 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z" (OuterVolumeSpecName: "kube-api-access-qsb2z") pod "25fb52ff-a8cf-443a-81b4-c998fbdd5789" (UID: "25fb52ff-a8cf-443a-81b4-c998fbdd5789"). InnerVolumeSpecName "kube-api-access-qsb2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.408909 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc" (OuterVolumeSpecName: "kube-api-access-dcwfc") pod "68936b54-d9ef-46f6-8781-f6793c92ad62" (UID: "68936b54-d9ef-46f6-8781-f6793c92ad62"). InnerVolumeSpecName "kube-api-access-dcwfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.415817 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25fb52ff-a8cf-443a-81b4-c998fbdd5789" (UID: "25fb52ff-a8cf-443a-81b4-c998fbdd5789"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504303 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504342 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504358 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcwfc\" (UniqueName: \"kubernetes.io/projected/68936b54-d9ef-46f6-8781-f6793c92ad62-kube-api-access-dcwfc\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504373 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsb2z\" (UniqueName: \"kubernetes.io/projected/25fb52ff-a8cf-443a-81b4-c998fbdd5789-kube-api-access-qsb2z\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504385 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fb52ff-a8cf-443a-81b4-c998fbdd5789-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504394 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504402 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fb52ff-a8cf-443a-81b4-c998fbdd5789-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504410 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68936b54-d9ef-46f6-8781-f6793c92ad62-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.504417 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68936b54-d9ef-46f6-8781-f6793c92ad62-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.533220 4762 generic.go:334] "Generic (PLEG): container finished" podID="68936b54-d9ef-46f6-8781-f6793c92ad62" containerID="9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5" exitCode=0 Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.533274 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" event={"ID":"68936b54-d9ef-46f6-8781-f6793c92ad62","Type":"ContainerDied","Data":"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5"} Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.533677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" event={"ID":"68936b54-d9ef-46f6-8781-f6793c92ad62","Type":"ContainerDied","Data":"d38421becd96c0ea60fe6ea956a1b2ead566006da282903a3b08e1debc3f9a7e"} Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.533828 4762 scope.go:117] "RemoveContainer" 
containerID="9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.533288 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.536834 4762 generic.go:334] "Generic (PLEG): container finished" podID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" containerID="0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875" exitCode=0 Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.536873 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" event={"ID":"25fb52ff-a8cf-443a-81b4-c998fbdd5789","Type":"ContainerDied","Data":"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875"} Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.536892 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.536901 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d" event={"ID":"25fb52ff-a8cf-443a-81b4-c998fbdd5789","Type":"ContainerDied","Data":"c5f03a651a94e7ac0fd329a89afe9b47cd922dbedf63090a0100d1472f06c4fd"} Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.557715 4762 scope.go:117] "RemoveContainer" containerID="9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5" Feb 17 14:11:30 crc kubenswrapper[4762]: E0217 14:11:30.558605 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5\": container with ID starting with 9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5 not found: ID does not exist" containerID="9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.558659 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5"} err="failed to get container status \"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5\": rpc error: code = NotFound desc = could not find container \"9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5\": container with ID starting with 9e1f3c0ae0da8a37f22a4df025f63c1fb05af3cc312bcd36add334c6286a47d5 not found: ID does not exist" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.558678 4762 scope.go:117] "RemoveContainer" containerID="0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.575223 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"] Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.578746 4762 scope.go:117] "RemoveContainer" containerID="0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875" Feb 17 14:11:30 crc kubenswrapper[4762]: E0217 14:11:30.579117 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875\": container with ID starting with 
0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875 not found: ID does not exist" containerID="0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.579165 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875"} err="failed to get container status \"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875\": rpc error: code = NotFound desc = could not find container \"0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875\": container with ID starting with 0c41496b9da5eacf6f1e78fd183b3e04cd6d47a828d83a06efada1f872596875 not found: ID does not exist" Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.579892 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-7w8m5"] Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.584769 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"] Feb 17 14:11:30 crc kubenswrapper[4762]: I0217 14:11:30.588819 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-bq89d"] Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.616457 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj"] Feb 17 14:11:31 crc kubenswrapper[4762]: E0217 14:11:31.616722 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68936b54-d9ef-46f6-8781-f6793c92ad62" containerName="controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.616752 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="68936b54-d9ef-46f6-8781-f6793c92ad62" containerName="controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: E0217 14:11:31.616769 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" containerName="route-controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.616775 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" containerName="route-controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.616862 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" containerName="route-controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.616871 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="68936b54-d9ef-46f6-8781-f6793c92ad62" containerName="controller-manager" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.617304 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.619136 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.619428 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.619584 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.619746 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.620240 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.620775 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d88455f4c-kd6g6"] Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.620926 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.621535 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.622686 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.622879 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.623654 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.623948 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.624075 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.624407 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.632839 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.636535 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj"] Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.639436 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d88455f4c-kd6g6"] Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719089 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-config\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719162 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-client-ca\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-serving-cert\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719259 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-config\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719279 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-client-ca\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719300 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl824\" (UniqueName: \"kubernetes.io/projected/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-kube-api-access-xl824\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-proxy-ca-bundles\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719469 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wn8\" (UniqueName: \"kubernetes.io/projected/562a1436-7ae3-4134-838b-9b5ddf40e2aa-kube-api-access-87wn8\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.719536 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/562a1436-7ae3-4134-838b-9b5ddf40e2aa-serving-cert\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-config\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820487 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-client-ca\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-serving-cert\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820542 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-config\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820572 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-client-ca\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl824\" (UniqueName: \"kubernetes.io/projected/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-kube-api-access-xl824\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820626 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-proxy-ca-bundles\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820661 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wn8\" (UniqueName: \"kubernetes.io/projected/562a1436-7ae3-4134-838b-9b5ddf40e2aa-kube-api-access-87wn8\") pod 
\"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.820683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/562a1436-7ae3-4134-838b-9b5ddf40e2aa-serving-cert\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.821845 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-proxy-ca-bundles\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.822185 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-client-ca\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.822261 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/562a1436-7ae3-4134-838b-9b5ddf40e2aa-config\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.822448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-client-ca\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.822566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-config\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.825548 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/562a1436-7ae3-4134-838b-9b5ddf40e2aa-serving-cert\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.827201 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-serving-cert\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.837917 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-87wn8\" (UniqueName: \"kubernetes.io/projected/562a1436-7ae3-4134-838b-9b5ddf40e2aa-kube-api-access-87wn8\") pod \"controller-manager-d88455f4c-kd6g6\" (UID: \"562a1436-7ae3-4134-838b-9b5ddf40e2aa\") " pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.842448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl824\" (UniqueName: \"kubernetes.io/projected/d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63-kube-api-access-xl824\") pod \"route-controller-manager-5ddbcdd46b-2bkgj\" (UID: \"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63\") " pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.935417 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:31 crc kubenswrapper[4762]: I0217 14:11:31.948064 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.079027 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fb52ff-a8cf-443a-81b4-c998fbdd5789" path="/var/lib/kubelet/pods/25fb52ff-a8cf-443a-81b4-c998fbdd5789/volumes" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.080333 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68936b54-d9ef-46f6-8781-f6793c92ad62" path="/var/lib/kubelet/pods/68936b54-d9ef-46f6-8781-f6793c92ad62/volumes" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.197504 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d88455f4c-kd6g6"] Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.350463 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj"] Feb 17 14:11:32 crc kubenswrapper[4762]: W0217 14:11:32.354308 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bdfc13_eeef_4d5c_a788_1a4dc1e65d63.slice/crio-c2e32ebc03a2cbb8516074232f76891fe0c269d2fff6b0b5df5625f8f0620d5a WatchSource:0}: Error finding container c2e32ebc03a2cbb8516074232f76891fe0c269d2fff6b0b5df5625f8f0620d5a: Status 404 returned error can't find the container with id c2e32ebc03a2cbb8516074232f76891fe0c269d2fff6b0b5df5625f8f0620d5a Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.550814 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" event={"ID":"562a1436-7ae3-4134-838b-9b5ddf40e2aa","Type":"ContainerStarted","Data":"59900e91b91b5715ace315e2cd45f18d2c74ae00cf5dbf2cbc77fa93cc62c864"} Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.550872 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" event={"ID":"562a1436-7ae3-4134-838b-9b5ddf40e2aa","Type":"ContainerStarted","Data":"bbffb78ebe856b51e62c46a50f79e7807049a6d8b7ac51e9d98768743a197893"} Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.551566 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:32 crc 
kubenswrapper[4762]: I0217 14:11:32.552564 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" event={"ID":"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63","Type":"ContainerStarted","Data":"af08250c0181b8e83df95a36f9d8c834b416d380408735d8abe1cab973711e2d"} Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.552694 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" event={"ID":"d8bdfc13-eeef-4d5c-a788-1a4dc1e65d63","Type":"ContainerStarted","Data":"c2e32ebc03a2cbb8516074232f76891fe0c269d2fff6b0b5df5625f8f0620d5a"} Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.553062 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.567802 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" podStartSLOduration=3.56778503 podStartE2EDuration="3.56778503s" podCreationTimestamp="2026-02-17 14:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:32.565496233 +0000 UTC m=+373.145496885" watchObservedRunningTime="2026-02-17 14:11:32.56778503 +0000 UTC m=+373.147785682" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.573133 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d88455f4c-kd6g6" Feb 17 14:11:32 crc kubenswrapper[4762]: I0217 14:11:32.589834 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" podStartSLOduration=3.58981526 podStartE2EDuration="3.58981526s" podCreationTimestamp="2026-02-17 14:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:32.586754671 +0000 UTC m=+373.166755343" watchObservedRunningTime="2026-02-17 14:11:32.58981526 +0000 UTC m=+373.169815912" Feb 17 14:11:33 crc kubenswrapper[4762]: I0217 14:11:33.383910 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5ddbcdd46b-2bkgj" Feb 17 14:11:45 crc kubenswrapper[4762]: I0217 14:11:45.924071 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hvzzr" Feb 17 14:11:45 crc kubenswrapper[4762]: I0217 14:11:45.992976 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.107713 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.108530 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5h5kh" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="registry-server" containerID="cri-o://b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" gracePeriod=30 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.115514 4762 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.115894 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpj7t" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="registry-server" containerID="cri-o://d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae" gracePeriod=30 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.124362 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.124603 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" containerID="cri-o://fbb7165e310ac8915278a1ab594016ad0bdda7c965fa741a3de68c7a1fa07588" gracePeriod=30 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.135185 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.135759 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lb2z7" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="registry-server" containerID="cri-o://2f762ef10cb4bf7ed4d53f849ab8cb444bb18752a7e7dc38fb4e587d464d0322" gracePeriod=30 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.147116 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.147356 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-28cgn" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="registry-server" containerID="cri-o://ed81fdd85e7cb910429f3cf771061c13a5cc19be1f4cd90b321c2d48e0b4e9c1" gracePeriod=30 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.153267 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kpxwm"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.160061 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.170533 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kpxwm"] Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.323467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc97x\" (UniqueName: \"kubernetes.io/projected/01244fb5-02d9-4328-ba6a-018283f64d07-kube-api-access-pc97x\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.323523 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.323682 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.424967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.425028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc97x\" (UniqueName: \"kubernetes.io/projected/01244fb5-02d9-4328-ba6a-018283f64d07-kube-api-access-pc97x\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.425062 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.435891 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.444885 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/01244fb5-02d9-4328-ba6a-018283f64d07-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.453382 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc97x\" (UniqueName: \"kubernetes.io/projected/01244fb5-02d9-4328-ba6a-018283f64d07-kube-api-access-pc97x\") pod \"marketplace-operator-79b997595-kpxwm\" (UID: \"01244fb5-02d9-4328-ba6a-018283f64d07\") " pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.637708 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.669896 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.721740 4762 generic.go:334] "Generic (PLEG): container finished" podID="23c1ddb0-986c-4801-9172-0f372eebae07" containerID="2f762ef10cb4bf7ed4d53f849ab8cb444bb18752a7e7dc38fb4e587d464d0322" exitCode=0 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.721839 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerDied","Data":"2f762ef10cb4bf7ed4d53f849ab8cb444bb18752a7e7dc38fb4e587d464d0322"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.735685 4762 generic.go:334] "Generic (PLEG): container finished" podID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerID="b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" exitCode=0 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.735779 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerDied","Data":"b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.748875 4762 generic.go:334] "Generic (PLEG): container finished" podID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerID="d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae" exitCode=0 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.748948 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerDied","Data":"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.748976 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpj7t" event={"ID":"17efb526-3519-4d99-bd81-cd6fed3a42aa","Type":"ContainerDied","Data":"7b02ff8b3474fab42237600397818b6b5adf0275ac76d12b1825a56fc9933952"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.748993 4762 scope.go:117] "RemoveContainer" containerID="d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.749102 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpj7t" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.753345 4762 generic.go:334] "Generic (PLEG): container finished" podID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerID="ed81fdd85e7cb910429f3cf771061c13a5cc19be1f4cd90b321c2d48e0b4e9c1" exitCode=0 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.753401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerDied","Data":"ed81fdd85e7cb910429f3cf771061c13a5cc19be1f4cd90b321c2d48e0b4e9c1"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.755732 4762 generic.go:334] "Generic (PLEG): container finished" podID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerID="fbb7165e310ac8915278a1ab594016ad0bdda7c965fa741a3de68c7a1fa07588" exitCode=0 Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.755756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerDied","Data":"fbb7165e310ac8915278a1ab594016ad0bdda7c965fa741a3de68c7a1fa07588"} Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.767366 4762 scope.go:117] "RemoveContainer" containerID="e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc" Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.773657 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f is running failed: container process not found" containerID="b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.774120 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f is running failed: container process not found" containerID="b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.775079 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f is running failed: container process not found" containerID="b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.775142 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-5h5kh" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="registry-server" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.814840 4762 scope.go:117] "RemoveContainer" containerID="2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.838543 4762 scope.go:117] "RemoveContainer" 
containerID="d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae" Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.839413 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae\": container with ID starting with d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae not found: ID does not exist" containerID="d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.839447 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae"} err="failed to get container status \"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae\": rpc error: code = NotFound desc = could not find container \"d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae\": container with ID starting with d5849f525391be6d4e8c3489468e557779c3f2f635bcccbc0c74a1a83aaa74ae not found: ID does not exist" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.839467 4762 scope.go:117] "RemoveContainer" containerID="e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc" Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.840812 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc\": container with ID starting with e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc not found: ID does not exist" containerID="e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.840844 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc"} err="failed to get container status \"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc\": rpc error: code = NotFound desc = could not find container \"e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc\": container with ID starting with e782c548798c734305c61881a5403ef3fc4cd163305a50604b37091f0a7640cc not found: ID does not exist" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.840857 4762 scope.go:117] "RemoveContainer" containerID="2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762" Feb 17 14:11:49 crc kubenswrapper[4762]: E0217 14:11:49.841372 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762\": container with ID starting with 2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762 not found: ID does not exist" containerID="2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.841389 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762"} err="failed to get container status \"2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762\": rpc error: code = NotFound desc = could not find container \"2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762\": container with ID starting with 
2e30a1588667d961c27d5b743083e6ba71b330d192444ed1750471e0671d3762 not found: ID does not exist" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.841402 4762 scope.go:117] "RemoveContainer" containerID="7a6ea7dcc9688017aa6d85d9918ae68333a411dddb372839ae3e4d61cf15c960" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.844274 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t5ld\" (UniqueName: \"kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld\") pod \"17efb526-3519-4d99-bd81-cd6fed3a42aa\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.844430 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities\") pod \"17efb526-3519-4d99-bd81-cd6fed3a42aa\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.844477 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content\") pod \"17efb526-3519-4d99-bd81-cd6fed3a42aa\" (UID: \"17efb526-3519-4d99-bd81-cd6fed3a42aa\") " Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.848488 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities" (OuterVolumeSpecName: "utilities") pod "17efb526-3519-4d99-bd81-cd6fed3a42aa" (UID: "17efb526-3519-4d99-bd81-cd6fed3a42aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.856156 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld" (OuterVolumeSpecName: "kube-api-access-6t5ld") pod "17efb526-3519-4d99-bd81-cd6fed3a42aa" (UID: "17efb526-3519-4d99-bd81-cd6fed3a42aa"). InnerVolumeSpecName "kube-api-access-6t5ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.909373 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.910077 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.917191 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28cgn" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.929271 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17efb526-3519-4d99-bd81-cd6fed3a42aa" (UID: "17efb526-3519-4d99-bd81-cd6fed3a42aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.945974 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.945999 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17efb526-3519-4d99-bd81-cd6fed3a42aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.946010 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t5ld\" (UniqueName: \"kubernetes.io/projected/17efb526-3519-4d99-bd81-cd6fed3a42aa-kube-api-access-6t5ld\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:49 crc kubenswrapper[4762]: I0217 14:11:49.950974 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities\") pod \"490d6026-4fbb-49b1-993c-09dd3e60db65\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046773 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content\") pod \"ea39a651-661f-4d01-9420-71469f5d2b8c\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046809 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities\") pod \"23c1ddb0-986c-4801-9172-0f372eebae07\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046842 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content\") pod \"490d6026-4fbb-49b1-993c-09dd3e60db65\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca\") pod \"2822ca68-2d20-4f3c-93aa-38f63a418c69\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046906 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mggzp\" (UniqueName: \"kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp\") pod \"490d6026-4fbb-49b1-993c-09dd3e60db65\" (UID: \"490d6026-4fbb-49b1-993c-09dd3e60db65\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046926 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsl57\" (UniqueName: \"kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57\") pod \"2822ca68-2d20-4f3c-93aa-38f63a418c69\" (UID: 
\"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046952 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpjh2\" (UniqueName: \"kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2\") pod \"ea39a651-661f-4d01-9420-71469f5d2b8c\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.046978 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics\") pod \"2822ca68-2d20-4f3c-93aa-38f63a418c69\" (UID: \"2822ca68-2d20-4f3c-93aa-38f63a418c69\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.047013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqzs\" (UniqueName: \"kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs\") pod \"23c1ddb0-986c-4801-9172-0f372eebae07\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.047052 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content\") pod \"23c1ddb0-986c-4801-9172-0f372eebae07\" (UID: \"23c1ddb0-986c-4801-9172-0f372eebae07\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.047075 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities\") pod \"ea39a651-661f-4d01-9420-71469f5d2b8c\" (UID: \"ea39a651-661f-4d01-9420-71469f5d2b8c\") " Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.048200 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2822ca68-2d20-4f3c-93aa-38f63a418c69" (UID: "2822ca68-2d20-4f3c-93aa-38f63a418c69"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.049038 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities" (OuterVolumeSpecName: "utilities") pod "ea39a651-661f-4d01-9420-71469f5d2b8c" (UID: "ea39a651-661f-4d01-9420-71469f5d2b8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.049290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities" (OuterVolumeSpecName: "utilities") pod "490d6026-4fbb-49b1-993c-09dd3e60db65" (UID: "490d6026-4fbb-49b1-993c-09dd3e60db65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.050266 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp" (OuterVolumeSpecName: "kube-api-access-mggzp") pod "490d6026-4fbb-49b1-993c-09dd3e60db65" (UID: "490d6026-4fbb-49b1-993c-09dd3e60db65"). 
InnerVolumeSpecName "kube-api-access-mggzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.052105 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57" (OuterVolumeSpecName: "kube-api-access-gsl57") pod "2822ca68-2d20-4f3c-93aa-38f63a418c69" (UID: "2822ca68-2d20-4f3c-93aa-38f63a418c69"). InnerVolumeSpecName "kube-api-access-gsl57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.052822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2822ca68-2d20-4f3c-93aa-38f63a418c69" (UID: "2822ca68-2d20-4f3c-93aa-38f63a418c69"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.055529 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities" (OuterVolumeSpecName: "utilities") pod "23c1ddb0-986c-4801-9172-0f372eebae07" (UID: "23c1ddb0-986c-4801-9172-0f372eebae07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.057280 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs" (OuterVolumeSpecName: "kube-api-access-5sqzs") pod "23c1ddb0-986c-4801-9172-0f372eebae07" (UID: "23c1ddb0-986c-4801-9172-0f372eebae07"). InnerVolumeSpecName "kube-api-access-5sqzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.057507 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2" (OuterVolumeSpecName: "kube-api-access-tpjh2") pod "ea39a651-661f-4d01-9420-71469f5d2b8c" (UID: "ea39a651-661f-4d01-9420-71469f5d2b8c"). InnerVolumeSpecName "kube-api-access-tpjh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.086872 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23c1ddb0-986c-4801-9172-0f372eebae07" (UID: "23c1ddb0-986c-4801-9172-0f372eebae07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.097218 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.100434 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpj7t"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.121397 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea39a651-661f-4d01-9420-71469f5d2b8c" (UID: "ea39a651-661f-4d01-9420-71469f5d2b8c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153348 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153383 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153393 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153403 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153412 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mggzp\" (UniqueName: \"kubernetes.io/projected/490d6026-4fbb-49b1-993c-09dd3e60db65-kube-api-access-mggzp\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153421 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsl57\" (UniqueName: \"kubernetes.io/projected/2822ca68-2d20-4f3c-93aa-38f63a418c69-kube-api-access-gsl57\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153430 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpjh2\" (UniqueName: \"kubernetes.io/projected/ea39a651-661f-4d01-9420-71469f5d2b8c-kube-api-access-tpjh2\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153437 4762 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2822ca68-2d20-4f3c-93aa-38f63a418c69-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153456 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sqzs\" (UniqueName: \"kubernetes.io/projected/23c1ddb0-986c-4801-9172-0f372eebae07-kube-api-access-5sqzs\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153464 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23c1ddb0-986c-4801-9172-0f372eebae07-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.153472 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea39a651-661f-4d01-9420-71469f5d2b8c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.215385 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "490d6026-4fbb-49b1-993c-09dd3e60db65" (UID: "490d6026-4fbb-49b1-993c-09dd3e60db65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.244703 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kpxwm"] Feb 17 14:11:50 crc kubenswrapper[4762]: W0217 14:11:50.248521 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01244fb5_02d9_4328_ba6a_018283f64d07.slice/crio-4055384d5005b535fca95acd4b3bf2afbe8953b865b76491f5a8322a23e2c578 WatchSource:0}: Error finding container 4055384d5005b535fca95acd4b3bf2afbe8953b865b76491f5a8322a23e2c578: Status 404 returned error can't find the container with id 4055384d5005b535fca95acd4b3bf2afbe8953b865b76491f5a8322a23e2c578 Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.254911 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d6026-4fbb-49b1-993c-09dd3e60db65-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.763109 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28cgn" event={"ID":"490d6026-4fbb-49b1-993c-09dd3e60db65","Type":"ContainerDied","Data":"95b7e3a89d7aa8fadf37ea9bf243e120b4c22021f16b6095b9fc4ba4e9574fa0"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.763237 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28cgn" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.763413 4762 scope.go:117] "RemoveContainer" containerID="ed81fdd85e7cb910429f3cf771061c13a5cc19be1f4cd90b321c2d48e0b4e9c1" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.765451 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb2z7" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.765460 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb2z7" event={"ID":"23c1ddb0-986c-4801-9172-0f372eebae07","Type":"ContainerDied","Data":"f977244a4c9ab995537d8980dba05a1b1b3ec3d4364b7c182eec382a42012338"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.768962 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" event={"ID":"2822ca68-2d20-4f3c-93aa-38f63a418c69","Type":"ContainerDied","Data":"425ec11b65afba8e7bc2b7b9c11829e3a3d45eb87429259d90d806e5f2f8eeef"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.769031 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xxdg7" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.771458 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5h5kh" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.771768 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5h5kh" event={"ID":"ea39a651-661f-4d01-9420-71469f5d2b8c","Type":"ContainerDied","Data":"50431a81480dca1d5aa8be321acb74024d022bb437e7fdb55f27dcaa9320d695"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.777124 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" event={"ID":"01244fb5-02d9-4328-ba6a-018283f64d07","Type":"ContainerStarted","Data":"31a2ed86005d4fc4dce6c15c50641ec3127557a380af96e5501742ac6dfd07ab"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.777546 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" event={"ID":"01244fb5-02d9-4328-ba6a-018283f64d07","Type":"ContainerStarted","Data":"4055384d5005b535fca95acd4b3bf2afbe8953b865b76491f5a8322a23e2c578"} Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.778219 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.784321 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.784782 4762 scope.go:117] "RemoveContainer" containerID="dffdf1b369e5e57cd2eddd1e31fcfc7853467ca7cbac06acb97d54866e17738a" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.802487 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kpxwm" podStartSLOduration=1.8024656540000001 podStartE2EDuration="1.802465654s" podCreationTimestamp="2026-02-17 14:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:50.801813768 +0000 UTC m=+391.381814440" watchObservedRunningTime="2026-02-17 14:11:50.802465654 +0000 UTC m=+391.382466306" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.825703 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.838043 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xxdg7"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.843017 4762 scope.go:117] "RemoveContainer" containerID="9ecff109aa58a217903f0d52a20f142acec4e3dcc4ea14415a3552896acdc421" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.843114 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.848025 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb2z7"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.871400 4762 scope.go:117] "RemoveContainer" containerID="2f762ef10cb4bf7ed4d53f849ab8cb444bb18752a7e7dc38fb4e587d464d0322" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.876867 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.880284 4762 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-28cgn"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.918717 4762 scope.go:117] "RemoveContainer" containerID="86710bb5aafd789e3f8fffcae0fcafc14bfefc204b8dc7713dd0ed34f0b475d7" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.919539 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.923255 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5h5kh"] Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.935169 4762 scope.go:117] "RemoveContainer" containerID="4f17dc0df37f3cd997ff008f30518b534ddf83822773d5e1bcf48f229630bbc6" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.953496 4762 scope.go:117] "RemoveContainer" containerID="fbb7165e310ac8915278a1ab594016ad0bdda7c965fa741a3de68c7a1fa07588" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.965441 4762 scope.go:117] "RemoveContainer" containerID="b1684888109399e8c09fe2e38fcf123377678d236537e6f1783a4fea87d95b5f" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.978220 4762 scope.go:117] "RemoveContainer" containerID="7f9dc20df7254a2d47c8b057031e67b139b4594ce641f4922ffb9d61fbb61c8d" Feb 17 14:11:50 crc kubenswrapper[4762]: I0217 14:11:50.993078 4762 scope.go:117] "RemoveContainer" containerID="a3917a426f245b435d453bce4d32b069cf10e28751f43a04699450c57e15258d" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.723826 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8przg"] Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724413 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724429 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724442 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724449 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724459 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724465 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724475 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724483 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724494 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724500 
4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724510 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724516 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724523 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724529 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724541 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724547 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724555 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724561 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="extract-content" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724568 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724575 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724583 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724590 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724600 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724606 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="extract-utilities" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724620 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724627 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: E0217 14:11:51.724637 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 
crc kubenswrapper[4762]: I0217 14:11:51.724662 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724772 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724856 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724867 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724896 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.724906 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" containerName="registry-server" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.725109 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" containerName="marketplace-operator" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.725983 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.731703 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.738278 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8przg"] Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.774429 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-utilities\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.774515 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-catalog-content\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.774543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtmb\" (UniqueName: \"kubernetes.io/projected/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-kube-api-access-7dtmb\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.875609 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtmb\" (UniqueName: \"kubernetes.io/projected/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-kube-api-access-7dtmb\") pod \"redhat-marketplace-8przg\" (UID: 
\"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.875758 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-utilities\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.875918 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-catalog-content\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.876478 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-catalog-content\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.876516 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-utilities\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:51 crc kubenswrapper[4762]: I0217 14:11:51.893790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtmb\" (UniqueName: \"kubernetes.io/projected/197d8c37-eac6-4f4a-9f95-fa1da2ff23e7-kube-api-access-7dtmb\") pod \"redhat-marketplace-8przg\" (UID: \"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7\") " pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.045183 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.080474 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17efb526-3519-4d99-bd81-cd6fed3a42aa" path="/var/lib/kubelet/pods/17efb526-3519-4d99-bd81-cd6fed3a42aa/volumes" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.081422 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c1ddb0-986c-4801-9172-0f372eebae07" path="/var/lib/kubelet/pods/23c1ddb0-986c-4801-9172-0f372eebae07/volumes" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.082227 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2822ca68-2d20-4f3c-93aa-38f63a418c69" path="/var/lib/kubelet/pods/2822ca68-2d20-4f3c-93aa-38f63a418c69/volumes" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.083359 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490d6026-4fbb-49b1-993c-09dd3e60db65" path="/var/lib/kubelet/pods/490d6026-4fbb-49b1-993c-09dd3e60db65/volumes" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.084092 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea39a651-661f-4d01-9420-71469f5d2b8c" path="/var/lib/kubelet/pods/ea39a651-661f-4d01-9420-71469f5d2b8c/volumes" Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.461304 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8przg"] Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.794895 4762 generic.go:334] "Generic (PLEG): container finished" podID="197d8c37-eac6-4f4a-9f95-fa1da2ff23e7" containerID="e50b499241c334a6a97ca02a780ba254d59b3fcb1fcf8e0ae4b44b2a30dacd4b" exitCode=0 Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.794977 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8przg" event={"ID":"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7","Type":"ContainerDied","Data":"e50b499241c334a6a97ca02a780ba254d59b3fcb1fcf8e0ae4b44b2a30dacd4b"} Feb 17 14:11:52 crc kubenswrapper[4762]: I0217 14:11:52.795020 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8przg" event={"ID":"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7","Type":"ContainerStarted","Data":"113dc26180eb61dda3a362abed2a113a2adf1feeb972d7f5abe62f4e04e5ce16"} Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.125205 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g66qj"] Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.126748 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.128418 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.142876 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g66qj"] Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.292922 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-catalog-content\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.293067 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjt8d\" (UniqueName: \"kubernetes.io/projected/440d9e9b-109c-4794-93b8-e18e3232ad49-kube-api-access-rjt8d\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.293175 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-utilities\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.394398 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-catalog-content\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.394487 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjt8d\" (UniqueName: \"kubernetes.io/projected/440d9e9b-109c-4794-93b8-e18e3232ad49-kube-api-access-rjt8d\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.394533 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-utilities\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.395400 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-catalog-content\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.395397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440d9e9b-109c-4794-93b8-e18e3232ad49-utilities\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " 
pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.427623 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjt8d\" (UniqueName: \"kubernetes.io/projected/440d9e9b-109c-4794-93b8-e18e3232ad49-kube-api-access-rjt8d\") pod \"redhat-operators-g66qj\" (UID: \"440d9e9b-109c-4794-93b8-e18e3232ad49\") " pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.449551 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:11:53 crc kubenswrapper[4762]: I0217 14:11:53.876974 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g66qj"] Feb 17 14:11:53 crc kubenswrapper[4762]: W0217 14:11:53.881713 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440d9e9b_109c_4794_93b8_e18e3232ad49.slice/crio-8cc352ab91aed7bd95b3dd492f922a9f15796f7ebbb0559b5547631e602839be WatchSource:0}: Error finding container 8cc352ab91aed7bd95b3dd492f922a9f15796f7ebbb0559b5547631e602839be: Status 404 returned error can't find the container with id 8cc352ab91aed7bd95b3dd492f922a9f15796f7ebbb0559b5547631e602839be Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.119933 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-blnm9"] Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.122923 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.126551 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.129218 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-blnm9"] Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.303931 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-catalog-content\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.304585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8l6b\" (UniqueName: \"kubernetes.io/projected/c3e8a03a-97a3-4727-84ef-9683f533aa17-kube-api-access-g8l6b\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.304851 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-utilities\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.405754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-catalog-content\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.406065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8l6b\" (UniqueName: \"kubernetes.io/projected/c3e8a03a-97a3-4727-84ef-9683f533aa17-kube-api-access-g8l6b\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.406230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-utilities\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.406253 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-catalog-content\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.406454 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e8a03a-97a3-4727-84ef-9683f533aa17-utilities\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.425279 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8l6b\" (UniqueName: \"kubernetes.io/projected/c3e8a03a-97a3-4727-84ef-9683f533aa17-kube-api-access-g8l6b\") pod \"community-operators-blnm9\" (UID: \"c3e8a03a-97a3-4727-84ef-9683f533aa17\") " pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.480002 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.621980 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.622334 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.808435 4762 generic.go:334] "Generic (PLEG): container finished" podID="197d8c37-eac6-4f4a-9f95-fa1da2ff23e7" containerID="c35c1f09606f1ab45621ef9fc93841ef12c4fb0da9e3ab58f5347d4a262d1610" exitCode=0 Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.808573 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8przg" event={"ID":"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7","Type":"ContainerDied","Data":"c35c1f09606f1ab45621ef9fc93841ef12c4fb0da9e3ab58f5347d4a262d1610"} Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.813970 4762 generic.go:334] "Generic (PLEG): container finished" podID="440d9e9b-109c-4794-93b8-e18e3232ad49" containerID="29fa4aa5da317d11207afe13bb04a3f8b1cae97004203db4097c8210eba3556e" exitCode=0 Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.814006 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g66qj" event={"ID":"440d9e9b-109c-4794-93b8-e18e3232ad49","Type":"ContainerDied","Data":"29fa4aa5da317d11207afe13bb04a3f8b1cae97004203db4097c8210eba3556e"} Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.814029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g66qj" event={"ID":"440d9e9b-109c-4794-93b8-e18e3232ad49","Type":"ContainerStarted","Data":"8cc352ab91aed7bd95b3dd492f922a9f15796f7ebbb0559b5547631e602839be"} Feb 17 14:11:54 crc kubenswrapper[4762]: W0217 14:11:54.871039 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e8a03a_97a3_4727_84ef_9683f533aa17.slice/crio-93236671964a2fbc3cc2bb95a592e1e60907021c7d25c6438ff641e4a546bc73 WatchSource:0}: Error finding container 93236671964a2fbc3cc2bb95a592e1e60907021c7d25c6438ff641e4a546bc73: Status 404 returned error can't find the container with id 93236671964a2fbc3cc2bb95a592e1e60907021c7d25c6438ff641e4a546bc73 Feb 17 14:11:54 crc kubenswrapper[4762]: I0217 14:11:54.873558 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-blnm9"] Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.521990 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hrk6m"] Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.523151 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.527290 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.529291 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrk6m"] Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.722366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcvl2\" (UniqueName: \"kubernetes.io/projected/f2458360-5ec8-41fa-a098-9cf66b726192-kube-api-access-pcvl2\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.722697 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-utilities\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.722727 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-catalog-content\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.820332 4762 generic.go:334] "Generic (PLEG): container finished" podID="c3e8a03a-97a3-4727-84ef-9683f533aa17" containerID="158d6dc8e7194e12e6398ac7d8006925d5522c57b1441d4a52eda30614b0daaf" exitCode=0 Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.820415 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blnm9" event={"ID":"c3e8a03a-97a3-4727-84ef-9683f533aa17","Type":"ContainerDied","Data":"158d6dc8e7194e12e6398ac7d8006925d5522c57b1441d4a52eda30614b0daaf"} Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.820447 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blnm9" event={"ID":"c3e8a03a-97a3-4727-84ef-9683f533aa17","Type":"ContainerStarted","Data":"93236671964a2fbc3cc2bb95a592e1e60907021c7d25c6438ff641e4a546bc73"} Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.823358 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8przg" event={"ID":"197d8c37-eac6-4f4a-9f95-fa1da2ff23e7","Type":"ContainerStarted","Data":"ea3e9e0ba9fea86dde69c0c481ed1702573d3d2b3d07a144328d180b73872069"} Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.824167 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcvl2\" (UniqueName: \"kubernetes.io/projected/f2458360-5ec8-41fa-a098-9cf66b726192-kube-api-access-pcvl2\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.824245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-utilities\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.824273 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-catalog-content\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.824850 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-catalog-content\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.824851 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2458360-5ec8-41fa-a098-9cf66b726192-utilities\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.847980 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcvl2\" (UniqueName: \"kubernetes.io/projected/f2458360-5ec8-41fa-a098-9cf66b726192-kube-api-access-pcvl2\") pod \"certified-operators-hrk6m\" (UID: \"f2458360-5ec8-41fa-a098-9cf66b726192\") " pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:55 crc kubenswrapper[4762]: I0217 14:11:55.872574 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8przg" podStartSLOduration=2.139296896 podStartE2EDuration="4.872555915s" podCreationTimestamp="2026-02-17 14:11:51 +0000 UTC" firstStartedPulling="2026-02-17 14:11:52.797219922 +0000 UTC m=+393.377220574" lastFinishedPulling="2026-02-17 14:11:55.530478941 +0000 UTC m=+396.110479593" observedRunningTime="2026-02-17 14:11:55.870247428 +0000 UTC m=+396.450248070" watchObservedRunningTime="2026-02-17 14:11:55.872555915 +0000 UTC m=+396.452556567" Feb 17 14:11:56 crc kubenswrapper[4762]: I0217 14:11:56.145956 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:11:56 crc kubenswrapper[4762]: I0217 14:11:56.546774 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrk6m"] Feb 17 14:11:56 crc kubenswrapper[4762]: I0217 14:11:56.829012 4762 generic.go:334] "Generic (PLEG): container finished" podID="f2458360-5ec8-41fa-a098-9cf66b726192" containerID="09a070db98467c2561f0e0899b9792c3d7d46e33ad3349a26cf9fc94f91f02ad" exitCode=0 Feb 17 14:11:56 crc kubenswrapper[4762]: I0217 14:11:56.829077 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrk6m" event={"ID":"f2458360-5ec8-41fa-a098-9cf66b726192","Type":"ContainerDied","Data":"09a070db98467c2561f0e0899b9792c3d7d46e33ad3349a26cf9fc94f91f02ad"} Feb 17 14:11:56 crc kubenswrapper[4762]: I0217 14:11:56.829336 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrk6m" event={"ID":"f2458360-5ec8-41fa-a098-9cf66b726192","Type":"ContainerStarted","Data":"2eaace2242184f5b7647e4d066a1431f746f5e367e6860471f9ce7e1ab26c1f6"} Feb 17 14:11:57 crc kubenswrapper[4762]: I0217 14:11:57.837218 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g66qj" event={"ID":"440d9e9b-109c-4794-93b8-e18e3232ad49","Type":"ContainerStarted","Data":"e5e537bdf6061284eab290b7cd5350cf06d6e83d6f1502321fab1fe3dc3ed6f2"} Feb 17 14:11:57 crc kubenswrapper[4762]: I0217 14:11:57.849765 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blnm9" event={"ID":"c3e8a03a-97a3-4727-84ef-9683f533aa17","Type":"ContainerStarted","Data":"4e2591991a14289209f63e4716f16178b330223a620e8b49b2a5f182ddf228fc"} Feb 17 14:11:58 crc kubenswrapper[4762]: I0217 14:11:58.856936 4762 generic.go:334] "Generic (PLEG): container finished" podID="440d9e9b-109c-4794-93b8-e18e3232ad49" containerID="e5e537bdf6061284eab290b7cd5350cf06d6e83d6f1502321fab1fe3dc3ed6f2" exitCode=0 Feb 17 14:11:58 crc kubenswrapper[4762]: I0217 14:11:58.857122 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g66qj" event={"ID":"440d9e9b-109c-4794-93b8-e18e3232ad49","Type":"ContainerDied","Data":"e5e537bdf6061284eab290b7cd5350cf06d6e83d6f1502321fab1fe3dc3ed6f2"} Feb 17 14:11:58 crc kubenswrapper[4762]: I0217 14:11:58.861102 4762 generic.go:334] "Generic (PLEG): container finished" podID="c3e8a03a-97a3-4727-84ef-9683f533aa17" containerID="4e2591991a14289209f63e4716f16178b330223a620e8b49b2a5f182ddf228fc" exitCode=0 Feb 17 14:11:58 crc kubenswrapper[4762]: I0217 14:11:58.861163 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blnm9" event={"ID":"c3e8a03a-97a3-4727-84ef-9683f533aa17","Type":"ContainerDied","Data":"4e2591991a14289209f63e4716f16178b330223a620e8b49b2a5f182ddf228fc"} Feb 17 14:11:58 crc kubenswrapper[4762]: I0217 14:11:58.863830 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrk6m" event={"ID":"f2458360-5ec8-41fa-a098-9cf66b726192","Type":"ContainerStarted","Data":"a69515c492cc14edab11f4a0a2144142aaaddb3c86994c5286f5429920cc6945"} Feb 17 14:11:59 crc kubenswrapper[4762]: I0217 14:11:59.871771 4762 generic.go:334] "Generic (PLEG): container finished" podID="f2458360-5ec8-41fa-a098-9cf66b726192" containerID="a69515c492cc14edab11f4a0a2144142aaaddb3c86994c5286f5429920cc6945" exitCode=0 Feb 17 14:11:59 crc 
kubenswrapper[4762]: I0217 14:11:59.871849 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrk6m" event={"ID":"f2458360-5ec8-41fa-a098-9cf66b726192","Type":"ContainerDied","Data":"a69515c492cc14edab11f4a0a2144142aaaddb3c86994c5286f5429920cc6945"} Feb 17 14:11:59 crc kubenswrapper[4762]: I0217 14:11:59.878092 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g66qj" event={"ID":"440d9e9b-109c-4794-93b8-e18e3232ad49","Type":"ContainerStarted","Data":"28189c977777587cfd06d72c274627b888df5e3373274e45791b368c6530ac1f"} Feb 17 14:11:59 crc kubenswrapper[4762]: I0217 14:11:59.881093 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blnm9" event={"ID":"c3e8a03a-97a3-4727-84ef-9683f533aa17","Type":"ContainerStarted","Data":"f34f7c76b65c6c6243fc2b32daf62b42e267896887515f972f77f09db23c4b9d"} Feb 17 14:11:59 crc kubenswrapper[4762]: I0217 14:11:59.913659 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g66qj" podStartSLOduration=2.48094857 podStartE2EDuration="6.913619865s" podCreationTimestamp="2026-02-17 14:11:53 +0000 UTC" firstStartedPulling="2026-02-17 14:11:54.815819105 +0000 UTC m=+395.395819747" lastFinishedPulling="2026-02-17 14:11:59.24849039 +0000 UTC m=+399.828491042" observedRunningTime="2026-02-17 14:11:59.912865317 +0000 UTC m=+400.492865979" watchObservedRunningTime="2026-02-17 14:11:59.913619865 +0000 UTC m=+400.493620517" Feb 17 14:11:59 crc kubenswrapper[4762]: I0217 14:11:59.934551 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-blnm9" podStartSLOduration=2.219748837 podStartE2EDuration="5.934526316s" podCreationTimestamp="2026-02-17 14:11:54 +0000 UTC" firstStartedPulling="2026-02-17 14:11:55.821948446 +0000 UTC m=+396.401949118" lastFinishedPulling="2026-02-17 14:11:59.536725955 +0000 UTC m=+400.116726597" observedRunningTime="2026-02-17 14:11:59.930978208 +0000 UTC m=+400.510978870" watchObservedRunningTime="2026-02-17 14:11:59.934526316 +0000 UTC m=+400.514526968" Feb 17 14:12:00 crc kubenswrapper[4762]: I0217 14:12:00.897006 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrk6m" event={"ID":"f2458360-5ec8-41fa-a098-9cf66b726192","Type":"ContainerStarted","Data":"d7745b9bc00eb7bf51469fd2953e05d302edc00c2c75371b65ba857a0e4f4377"} Feb 17 14:12:02 crc kubenswrapper[4762]: I0217 14:12:02.045612 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:12:02 crc kubenswrapper[4762]: I0217 14:12:02.046040 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:12:02 crc kubenswrapper[4762]: I0217 14:12:02.104857 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:12:02 crc kubenswrapper[4762]: I0217 14:12:02.128143 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrk6m" podStartSLOduration=3.655579584 podStartE2EDuration="7.128119804s" podCreationTimestamp="2026-02-17 14:11:55 +0000 UTC" firstStartedPulling="2026-02-17 14:11:56.830471417 +0000 UTC m=+397.410472069" lastFinishedPulling="2026-02-17 14:12:00.303011637 +0000 UTC 
m=+400.883012289" observedRunningTime="2026-02-17 14:12:00.918869366 +0000 UTC m=+401.498870028" watchObservedRunningTime="2026-02-17 14:12:02.128119804 +0000 UTC m=+402.708120456" Feb 17 14:12:03 crc kubenswrapper[4762]: I0217 14:12:03.033774 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8przg" Feb 17 14:12:03 crc kubenswrapper[4762]: I0217 14:12:03.450147 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:12:03 crc kubenswrapper[4762]: I0217 14:12:03.450495 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:12:04 crc kubenswrapper[4762]: I0217 14:12:04.480512 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:12:04 crc kubenswrapper[4762]: I0217 14:12:04.480567 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:12:04 crc kubenswrapper[4762]: I0217 14:12:04.495441 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g66qj" podUID="440d9e9b-109c-4794-93b8-e18e3232ad49" containerName="registry-server" probeResult="failure" output=< Feb 17 14:12:04 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:12:04 crc kubenswrapper[4762]: > Feb 17 14:12:04 crc kubenswrapper[4762]: I0217 14:12:04.521689 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:12:05 crc kubenswrapper[4762]: I0217 14:12:05.061462 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-blnm9" Feb 17 14:12:06 crc kubenswrapper[4762]: I0217 14:12:06.146416 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:12:06 crc kubenswrapper[4762]: I0217 14:12:06.146595 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:12:06 crc kubenswrapper[4762]: I0217 14:12:06.192755 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:12:07 crc kubenswrapper[4762]: I0217 14:12:07.058969 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hrk6m" Feb 17 14:12:11 crc kubenswrapper[4762]: I0217 14:12:11.034990 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" podUID="6c354ccb-6431-46df-a43d-d3e97f3529ae" containerName="registry" containerID="cri-o://364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07" gracePeriod=30 Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.508429 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.691171 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p79k9\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.691265 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692117 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692275 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692629 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692819 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692861 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.692923 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.693342 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.693562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted\") pod \"6c354ccb-6431-46df-a43d-d3e97f3529ae\" (UID: \"6c354ccb-6431-46df-a43d-d3e97f3529ae\") " Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.694140 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.694165 4762 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.697256 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.699771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.700079 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9" (OuterVolumeSpecName: "kube-api-access-p79k9") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "kube-api-access-p79k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.700254 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.702033 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.709915 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6c354ccb-6431-46df-a43d-d3e97f3529ae" (UID: "6c354ccb-6431-46df-a43d-d3e97f3529ae"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.795960 4762 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.796006 4762 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.796022 4762 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c354ccb-6431-46df-a43d-d3e97f3529ae-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.796033 4762 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c354ccb-6431-46df-a43d-d3e97f3529ae-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:12 crc kubenswrapper[4762]: I0217 14:12:12.796042 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p79k9\" (UniqueName: \"kubernetes.io/projected/6c354ccb-6431-46df-a43d-d3e97f3529ae-kube-api-access-p79k9\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.058834 4762 generic.go:334] "Generic (PLEG): container finished" podID="6c354ccb-6431-46df-a43d-d3e97f3529ae" containerID="364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07" exitCode=0 Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.058929 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.058913 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" event={"ID":"6c354ccb-6431-46df-a43d-d3e97f3529ae","Type":"ContainerDied","Data":"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07"} Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.059082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lm4gz" event={"ID":"6c354ccb-6431-46df-a43d-d3e97f3529ae","Type":"ContainerDied","Data":"9b5980c9d8a065bcd4209997c1ae2ce7fe63b4f509b7f39019b517657c34910b"} Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.059103 4762 scope.go:117] "RemoveContainer" containerID="364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.074747 4762 scope.go:117] "RemoveContainer" containerID="364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07" Feb 17 14:12:13 crc kubenswrapper[4762]: E0217 14:12:13.075352 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07\": container with ID starting with 364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07 not found: ID does not exist" containerID="364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.075391 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07"} err="failed to get container status \"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07\": rpc error: code = NotFound desc = could not find container \"364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07\": container with ID starting with 364efef270ed3ce173e748d839acd84ce2a302789ed8a1627ceb9b0e35f69b07 not found: ID does not exist" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.092355 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"] Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.099386 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lm4gz"] Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.492076 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:12:13 crc kubenswrapper[4762]: I0217 14:12:13.536446 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g66qj" Feb 17 14:12:14 crc kubenswrapper[4762]: I0217 14:12:14.084581 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c354ccb-6431-46df-a43d-d3e97f3529ae" path="/var/lib/kubelet/pods/6c354ccb-6431-46df-a43d-d3e97f3529ae/volumes" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.802912 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"] Feb 17 14:12:20 crc kubenswrapper[4762]: E0217 14:12:20.804863 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c354ccb-6431-46df-a43d-d3e97f3529ae" containerName="registry" Feb 17 14:12:20 
crc kubenswrapper[4762]: I0217 14:12:20.804881 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c354ccb-6431-46df-a43d-d3e97f3529ae" containerName="registry" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.804992 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c354ccb-6431-46df-a43d-d3e97f3529ae" containerName="registry" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.805360 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.808528 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.808571 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.808587 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.808788 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.808856 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.811405 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"] Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.922812 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.922864 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlc8s\" (UniqueName: \"kubernetes.io/projected/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-kube-api-access-jlc8s\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" Feb 17 14:12:20 crc kubenswrapper[4762]: I0217 14:12:20.922907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.024547 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" Feb 17 14:12:21 crc kubenswrapper[4762]: 
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.024602 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlc8s\" (UniqueName: \"kubernetes.io/projected/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-kube-api-access-jlc8s\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.024657 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.025749 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.031300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.039662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlc8s\" (UniqueName: \"kubernetes.io/projected/c88b02b0-a6f4-4d85-b0c1-8529442d07ca-kube-api-access-jlc8s\") pod \"cluster-monitoring-operator-6d5b84845-mfdgf\" (UID: \"c88b02b0-a6f4-4d85-b0c1-8529442d07ca\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.119676 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"
Feb 17 14:12:21 crc kubenswrapper[4762]: I0217 14:12:21.594866 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf"]
Feb 17 14:12:22 crc kubenswrapper[4762]: I0217 14:12:22.120203 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" event={"ID":"c88b02b0-a6f4-4d85-b0c1-8529442d07ca","Type":"ContainerStarted","Data":"0f1f5d40158579aa53e69381eb32a7dda660824390ec20f536367bc76cc3f468"}
Feb 17 14:12:23 crc kubenswrapper[4762]: I0217 14:12:23.946218 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"]
Feb 17 14:12:23 crc kubenswrapper[4762]: I0217 14:12:23.947340 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:23 crc kubenswrapper[4762]: I0217 14:12:23.948716 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-f86b2"
Feb 17 14:12:23 crc kubenswrapper[4762]: I0217 14:12:23.949834 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Feb 17 14:12:23 crc kubenswrapper[4762]: I0217 14:12:23.958425 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"]
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.064841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2hv8v\" (UID: \"6354819f-2c22-4df6-b8ec-4fb4805e759c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.130798 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" event={"ID":"c88b02b0-a6f4-4d85-b0c1-8529442d07ca","Type":"ContainerStarted","Data":"fdbe466123368c1d9df799f40d1ee75dd40f62024db07abaa0acc7a33b7db3fd"}
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.144214 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mfdgf" podStartSLOduration=2.360344473 podStartE2EDuration="4.144199453s" podCreationTimestamp="2026-02-17 14:12:20 +0000 UTC" firstStartedPulling="2026-02-17 14:12:21.60523985 +0000 UTC m=+422.185240502" lastFinishedPulling="2026-02-17 14:12:23.38909483 +0000 UTC m=+423.969095482" observedRunningTime="2026-02-17 14:12:24.143217748 +0000 UTC m=+424.723218400" watchObservedRunningTime="2026-02-17 14:12:24.144199453 +0000 UTC m=+424.724200105"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.166327 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2hv8v\" (UID: \"6354819f-2c22-4df6-b8ec-4fb4805e759c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:24 crc kubenswrapper[4762]: E0217 14:12:24.166498 4762 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Feb 17 14:12:24 crc kubenswrapper[4762]: E0217 14:12:24.166575 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates podName:6354819f-2c22-4df6-b8ec-4fb4805e759c nodeName:}" failed. No retries permitted until 2026-02-17 14:12:24.666553369 +0000 UTC m=+425.246554021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-2hv8v" (UID: "6354819f-2c22-4df6-b8ec-4fb4805e759c") : secret "prometheus-operator-admission-webhook-tls" not found
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.620991 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.621058 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.621103 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.621634 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.621726 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7" gracePeriod=600
Feb 17 14:12:24 crc kubenswrapper[4762]: E0217 14:12:24.664933 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb11ce5_3ff7_4743_a879_95285dae2998.slice/crio-conmon-b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.672771 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2hv8v\" (UID: \"6354819f-2c22-4df6-b8ec-4fb4805e759c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.678311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6354819f-2c22-4df6-b8ec-4fb4805e759c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2hv8v\" (UID: \"6354819f-2c22-4df6-b8ec-4fb4805e759c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:24 crc kubenswrapper[4762]: I0217 14:12:24.860729 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:25 crc kubenswrapper[4762]: I0217 14:12:25.138903 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7" exitCode=0
Feb 17 14:12:25 crc kubenswrapper[4762]: I0217 14:12:25.139678 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7"}
Feb 17 14:12:25 crc kubenswrapper[4762]: I0217 14:12:25.139733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b"}
Feb 17 14:12:25 crc kubenswrapper[4762]: I0217 14:12:25.139787 4762 scope.go:117] "RemoveContainer" containerID="205968b2e597c1dbfc2fa8e563c643ff14d674392f329ebda8d3dd2086317fc5"
Feb 17 14:12:25 crc kubenswrapper[4762]: I0217 14:12:25.252337 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"]
Feb 17 14:12:25 crc kubenswrapper[4762]: W0217 14:12:25.254542 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6354819f_2c22_4df6_b8ec_4fb4805e759c.slice/crio-0525ffd1a08a454d5667a5ee895f36aeda3e7c62bea541d34417ea2be8cd1b23 WatchSource:0}: Error finding container 0525ffd1a08a454d5667a5ee895f36aeda3e7c62bea541d34417ea2be8cd1b23: Status 404 returned error can't find the container with id 0525ffd1a08a454d5667a5ee895f36aeda3e7c62bea541d34417ea2be8cd1b23
Feb 17 14:12:26 crc kubenswrapper[4762]: I0217 14:12:26.145989 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v" event={"ID":"6354819f-2c22-4df6-b8ec-4fb4805e759c","Type":"ContainerStarted","Data":"0525ffd1a08a454d5667a5ee895f36aeda3e7c62bea541d34417ea2be8cd1b23"}
Feb 17 14:12:28 crc kubenswrapper[4762]: I0217 14:12:28.163914 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v" event={"ID":"6354819f-2c22-4df6-b8ec-4fb4805e759c","Type":"ContainerStarted","Data":"d2c3fd817dd810fe19c842a313818fea53b99885044fcafdb25c27b1a24763d0"}
Feb 17 14:12:28 crc kubenswrapper[4762]: I0217 14:12:28.164448 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:28 crc kubenswrapper[4762]: I0217 14:12:28.168761 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v"
Feb 17 14:12:28 crc kubenswrapper[4762]: I0217 14:12:28.180032 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2hv8v" podStartSLOduration=2.72827687 podStartE2EDuration="5.180015743s" podCreationTimestamp="2026-02-17 14:12:23 +0000 UTC" firstStartedPulling="2026-02-17 14:12:25.256839756 +0000 UTC m=+425.836840408" lastFinishedPulling="2026-02-17 14:12:27.708578629 +0000 UTC m=+428.288579281" observedRunningTime="2026-02-17 14:12:28.178532256 +0000 UTC m=+428.758532898" watchObservedRunningTime="2026-02-17 14:12:28.180015743 +0000 UTC m=+428.760016405"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.007125 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-n4c4z"]
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.008140 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.009975 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.010382 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.011547 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-dtbrb"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.022941 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-n4c4z"]
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.060043 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.161482 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.161554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b15fb60-3f33-42d2-8a86-259b3143e14c-metrics-client-ca\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.161848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlc5q\" (UniqueName: \"kubernetes.io/projected/0b15fb60-3f33-42d2-8a86-259b3143e14c-kube-api-access-qlc5q\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.162013 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z"
Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.263840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlc5q\" (UniqueName:
\"kubernetes.io/projected/0b15fb60-3f33-42d2-8a86-259b3143e14c-kube-api-access-qlc5q\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.263947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.264021 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.264083 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b15fb60-3f33-42d2-8a86-259b3143e14c-metrics-client-ca\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.265087 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b15fb60-3f33-42d2-8a86-259b3143e14c-metrics-client-ca\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.270473 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.272127 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b15fb60-3f33-42d2-8a86-259b3143e14c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.280508 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlc5q\" (UniqueName: \"kubernetes.io/projected/0b15fb60-3f33-42d2-8a86-259b3143e14c-kube-api-access-qlc5q\") pod \"prometheus-operator-db54df47d-n4c4z\" (UID: \"0b15fb60-3f33-42d2-8a86-259b3143e14c\") " pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.362337 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" Feb 17 14:12:29 crc kubenswrapper[4762]: I0217 14:12:29.823280 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-n4c4z"] Feb 17 14:12:30 crc kubenswrapper[4762]: I0217 14:12:30.175400 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" event={"ID":"0b15fb60-3f33-42d2-8a86-259b3143e14c","Type":"ContainerStarted","Data":"d6a817ad0782c3bb1ebc2d50e586fdea21d65b0e2ccb9c418cebca14f20c55f1"} Feb 17 14:12:32 crc kubenswrapper[4762]: I0217 14:12:32.202371 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" event={"ID":"0b15fb60-3f33-42d2-8a86-259b3143e14c","Type":"ContainerStarted","Data":"cf90f1f68890ed6e3557d0def4124540d851d4a1061e6064f80f622c0fef4e2e"} Feb 17 14:12:32 crc kubenswrapper[4762]: I0217 14:12:32.202414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" event={"ID":"0b15fb60-3f33-42d2-8a86-259b3143e14c","Type":"ContainerStarted","Data":"62df800ee14b64be2d5141bbe22e19f00be09c9bdc83c4dd3bac3953730f453d"} Feb 17 14:12:32 crc kubenswrapper[4762]: I0217 14:12:32.221428 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-n4c4z" podStartSLOduration=2.35549885 podStartE2EDuration="4.221407751s" podCreationTimestamp="2026-02-17 14:12:28 +0000 UTC" firstStartedPulling="2026-02-17 14:12:29.833495667 +0000 UTC m=+430.413496319" lastFinishedPulling="2026-02-17 14:12:31.699404568 +0000 UTC m=+432.279405220" observedRunningTime="2026-02-17 14:12:32.217209926 +0000 UTC m=+432.797210578" watchObservedRunningTime="2026-02-17 14:12:32.221407751 +0000 UTC m=+432.801408423" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.424849 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff"] Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.426403 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.430907 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn"] Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.431469 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.431631 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.432140 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.432177 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-k77sw" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.432912 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.435606 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.435804 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.435973 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-g54wj" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.437499 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff"] Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.460483 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn"] Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.522282 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d7hz4"] Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.523604 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.527119 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-fs6wh" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.527193 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.527638 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533321 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfnm\" (UniqueName: \"kubernetes.io/projected/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-kube-api-access-nnfnm\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533416 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533484 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533510 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533576 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dw8h\" (UniqueName: \"kubernetes.io/projected/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-api-access-4dw8h\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533672 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533690 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/96349f51-aa37-475c-b4f2-2aa495b6bdef-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.533714 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.634838 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " 
pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635208 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635254 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dw8h\" (UniqueName: \"kubernetes.io/projected/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-api-access-4dw8h\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635285 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-textfile\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635345 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-root\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-tls\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-wtmp\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635438 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aecaa0a-6718-4401-8393-84526f745355-metrics-client-ca\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635472 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/96349f51-aa37-475c-b4f2-2aa495b6bdef-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635538 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-sys\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfnm\" (UniqueName: \"kubernetes.io/projected/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-kube-api-access-nnfnm\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: E0217 14:12:34.635691 4762 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Feb 17 14:12:34 crc kubenswrapper[4762]: E0217 14:12:34.635753 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls podName:96349f51-aa37-475c-b4f2-2aa495b6bdef nodeName:}" failed. No retries permitted until 2026-02-17 14:12:35.135733412 +0000 UTC m=+435.715734064 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-nvcff" (UID: "96349f51-aa37-475c-b4f2-2aa495b6bdef") : secret "kube-state-metrics-tls" not found Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.635976 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636198 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636390 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/96349f51-aa37-475c-b4f2-2aa495b6bdef-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vw5r\" (UniqueName: \"kubernetes.io/projected/6aecaa0a-6718-4401-8393-84526f745355-kube-api-access-4vw5r\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.636793 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96349f51-aa37-475c-b4f2-2aa495b6bdef-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.641312 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.658359 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.658390 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.661423 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfnm\" (UniqueName: \"kubernetes.io/projected/f088e084-6f3f-4f70-bcb8-53d6bc4cb34b-kube-api-access-nnfnm\") pod \"openshift-state-metrics-566fddb674-w2vzn\" (UID: \"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.663189 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dw8h\" (UniqueName: \"kubernetes.io/projected/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-api-access-4dw8h\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737449 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-sys\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vw5r\" (UniqueName: \"kubernetes.io/projected/6aecaa0a-6718-4401-8393-84526f745355-kube-api-access-4vw5r\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737535 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-textfile\") pod \"node-exporter-d7hz4\" (UID: 
\"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-root\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-tls\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737595 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-wtmp\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737594 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-sys\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737813 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aecaa0a-6718-4401-8393-84526f745355-metrics-client-ca\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.737950 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-root\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.738438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-textfile\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.738553 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aecaa0a-6718-4401-8393-84526f745355-metrics-client-ca\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.738607 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-wtmp\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.751973 4762 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.757994 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-tls\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.765336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vw5r\" (UniqueName: \"kubernetes.io/projected/6aecaa0a-6718-4401-8393-84526f745355-kube-api-access-4vw5r\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.767946 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aecaa0a-6718-4401-8393-84526f745355-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d7hz4\" (UID: \"6aecaa0a-6718-4401-8393-84526f745355\") " pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:34 crc kubenswrapper[4762]: I0217 14:12:34.842573 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d7hz4" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.143011 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.144160 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn"] Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.148744 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/96349f51-aa37-475c-b4f2-2aa495b6bdef-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nvcff\" (UID: \"96349f51-aa37-475c-b4f2-2aa495b6bdef\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.218786 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d7hz4" event={"ID":"6aecaa0a-6718-4401-8393-84526f745355","Type":"ContainerStarted","Data":"07df97f3c4fb03b2f1475f4b393fa157071bc578ad3dcf11d91534ada547fbd2"} Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.220011 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" event={"ID":"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b","Type":"ContainerStarted","Data":"10b8521ba13c97a681b70db897a1de79288b3be9b7c77bb2e419fde0a74c075c"} Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.342668 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.486753 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.489234 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497151 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497312 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497382 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497216 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497418 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497750 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-mshgh" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497793 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.497848 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.499813 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.523845 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.553985 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c467\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-kube-api-access-8c467\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554098 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554134 
4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554222 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554253 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-web-config\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554307 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-config-out\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554333 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554362 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554386 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.554413 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656172 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656206 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-web-config\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656227 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-config-out\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656243 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656289 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656306 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656333 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656354 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c467\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-kube-api-access-8c467\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656384 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.656403 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.657750 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.659740 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.661233 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4c832c-bd71-458c-ab27-0119e342986c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.664974 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.667044 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff"] Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.667154 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.667636 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.668466 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.669136 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-web-config\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.673967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff4c832c-bd71-458c-ab27-0119e342986c-config-out\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.675084 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ff4c832c-bd71-458c-ab27-0119e342986c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.676560 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.680981 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c467\" (UniqueName: \"kubernetes.io/projected/ff4c832c-bd71-458c-ab27-0119e342986c-kube-api-access-8c467\") pod \"alertmanager-main-0\" (UID: \"ff4c832c-bd71-458c-ab27-0119e342986c\") " pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:35 crc kubenswrapper[4762]: I0217 14:12:35.816361 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.250852 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" event={"ID":"96349f51-aa37-475c-b4f2-2aa495b6bdef","Type":"ContainerStarted","Data":"91b31e7113c529e96e75c15ca7674e413d8f8260fb079b49a124608d0ada51c6"} Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.254223 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" event={"ID":"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b","Type":"ContainerStarted","Data":"2eafc6b516c2d7f8978e0c915489ad78b201ea185d2448ab17c3e83d0ebd5da0"} Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.254254 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" event={"ID":"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b","Type":"ContainerStarted","Data":"3e2d8bd1b0a02e3446777649de9b806bc830acf75e23c69d91a5d6a00dca92d9"} Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.549544 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6db88d458f-nd42s"] Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.551472 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.553829 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.554079 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.554210 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.555051 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.555830 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-5546r" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.557578 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-2iov2dl6e295s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.562913 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.579977 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6db88d458f-nd42s"] Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.589631 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 17 14:12:36 crc kubenswrapper[4762]: W0217 14:12:36.661525 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4c832c_bd71_458c_ab27_0119e342986c.slice/crio-158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10 WatchSource:0}: Error finding container 158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10: Status 404 returned error 
Feb 17 14:12:36 crc kubenswrapper[4762]: W0217 14:12:36.661525 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4c832c_bd71_458c_ab27_0119e342986c.slice/crio-158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10 WatchSource:0}: Error finding container 158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10: Status 404 returned error can't find the container with id 158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698403 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698441 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698512 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-grpc-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.698539 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba3c53b4-fea6-4c10-af28-1461348ffbd1-metrics-client-ca\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s"
\"kubernetes.io/projected/ba3c53b4-fea6-4c10-af28-1461348ffbd1-kube-api-access-m6hfk\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800003 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800098 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800128 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800172 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-grpc-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800201 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba3c53b4-fea6-4c10-af28-1461348ffbd1-metrics-client-ca\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800237 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6hfk\" (UniqueName: \"kubernetes.io/projected/ba3c53b4-fea6-4c10-af28-1461348ffbd1-kube-api-access-m6hfk\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.800267 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.802112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba3c53b4-fea6-4c10-af28-1461348ffbd1-metrics-client-ca\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.805954 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.806137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-grpc-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.807691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-tls\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.810427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.813145 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.817701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba3c53b4-fea6-4c10-af28-1461348ffbd1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.820423 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6hfk\" (UniqueName: \"kubernetes.io/projected/ba3c53b4-fea6-4c10-af28-1461348ffbd1-kube-api-access-m6hfk\") pod 
\"thanos-querier-6db88d458f-nd42s\" (UID: \"ba3c53b4-fea6-4c10-af28-1461348ffbd1\") " pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:36 crc kubenswrapper[4762]: I0217 14:12:36.877736 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:37 crc kubenswrapper[4762]: I0217 14:12:37.264945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"158e970b0ffd980a568e5b6f20303c8f8be944dd7d5f3919af3454aa62b7ed10"} Feb 17 14:12:37 crc kubenswrapper[4762]: I0217 14:12:37.267043 4762 generic.go:334] "Generic (PLEG): container finished" podID="6aecaa0a-6718-4401-8393-84526f745355" containerID="41698489cf5a43f4041534976a74fba183211de3d6e71598899cc0cc22f55e5b" exitCode=0 Feb 17 14:12:37 crc kubenswrapper[4762]: I0217 14:12:37.267107 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d7hz4" event={"ID":"6aecaa0a-6718-4401-8393-84526f745355","Type":"ContainerDied","Data":"41698489cf5a43f4041534976a74fba183211de3d6e71598899cc0cc22f55e5b"} Feb 17 14:12:37 crc kubenswrapper[4762]: I0217 14:12:37.301966 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6db88d458f-nd42s"] Feb 17 14:12:37 crc kubenswrapper[4762]: W0217 14:12:37.879830 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3c53b4_fea6_4c10_af28_1461348ffbd1.slice/crio-b291e0b6a4496c60d25a4ca5f73cb00c34f90362af6312e24167e3a920538059 WatchSource:0}: Error finding container b291e0b6a4496c60d25a4ca5f73cb00c34f90362af6312e24167e3a920538059: Status 404 returned error can't find the container with id b291e0b6a4496c60d25a4ca5f73cb00c34f90362af6312e24167e3a920538059 Feb 17 14:12:38 crc kubenswrapper[4762]: I0217 14:12:38.343323 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"b291e0b6a4496c60d25a4ca5f73cb00c34f90362af6312e24167e3a920538059"} Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.290276 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"] Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.291204 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.301673 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"] Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.454544 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8v5\" (UniqueName: \"kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.454972 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.455002 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.455120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.455278 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.455312 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.455340 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.556909 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc 
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.624011 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.624231 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.625701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.626072 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8v5\" (UniqueName: \"kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.626158 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.626226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.626390 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.628391 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.628552 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.631056 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.636709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.643785 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.650174 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8v5\" (UniqueName: \"kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5\") pod \"console-5cb59b7fc9-c5ld6\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " pod="openshift-console/console-5cb59b7fc9-c5ld6"
Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.790022 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6544759b79-fvggd"]
Need to start a new one" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.795894 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.795903 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-9rd62u6q8g8r2" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.796149 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.796289 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.796161 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-5xptn" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.796478 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.843301 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpqq\" (UniqueName: \"kubernetes.io/projected/db7e8c46-733c-49cf-8970-246ddf547747-kube-api-access-pkpqq\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.843893 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-metrics-server-audit-profiles\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.844176 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.844414 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-client-certs\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.844613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7e8c46-733c-49cf-8970-246ddf547747-audit-log\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.844954 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-client-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.845211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-server-tls\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.908720 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946657 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-client-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-server-tls\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946770 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpqq\" (UniqueName: \"kubernetes.io/projected/db7e8c46-733c-49cf-8970-246ddf547747-kube-api-access-pkpqq\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-metrics-server-audit-profiles\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946821 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946847 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-client-certs\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.946863 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7e8c46-733c-49cf-8970-246ddf547747-audit-log\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.947847 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7e8c46-733c-49cf-8970-246ddf547747-audit-log\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.948349 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.948790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7e8c46-733c-49cf-8970-246ddf547747-metrics-server-audit-profiles\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.951776 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-client-certs\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.952400 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-secret-metrics-server-tls\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.952433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7e8c46-733c-49cf-8970-246ddf547747-client-ca-bundle\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:39 crc kubenswrapper[4762]: I0217 14:12:39.974705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpqq\" (UniqueName: \"kubernetes.io/projected/db7e8c46-733c-49cf-8970-246ddf547747-kube-api-access-pkpqq\") pod \"metrics-server-6544759b79-fvggd\" (UID: \"db7e8c46-733c-49cf-8970-246ddf547747\") " pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.138274 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6544759b79-fvggd"] Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.380024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" event={"ID":"96349f51-aa37-475c-b4f2-2aa495b6bdef","Type":"ContainerStarted","Data":"a74827356d706606f49640700a1b392d6174c1dc49834cd30205ddb88bbdef1c"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.380091 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" event={"ID":"96349f51-aa37-475c-b4f2-2aa495b6bdef","Type":"ContainerStarted","Data":"ea1ebc8cb981b48c6c8d2d5f4ad8c874f145b3f11258e6c4a75ad39ee37ad544"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.382090 4762 generic.go:334] "Generic (PLEG): container finished" podID="ff4c832c-bd71-458c-ab27-0119e342986c" containerID="a8065ac1aacc676e4f737afbc4dfd934d5c4ff436aae614593b554fa5a073b62" exitCode=0 Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.382151 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerDied","Data":"a8065ac1aacc676e4f737afbc4dfd934d5c4ff436aae614593b554fa5a073b62"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.394634 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d7hz4" event={"ID":"6aecaa0a-6718-4401-8393-84526f745355","Type":"ContainerStarted","Data":"f9badd017bcce069cc35739109c6a7c8cdfe5978cbbf9bf2cbefa092e6707fa3"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.394738 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d7hz4" event={"ID":"6aecaa0a-6718-4401-8393-84526f745355","Type":"ContainerStarted","Data":"224f14ea35f9f950d3406fdeafb587dbd5d872bf6984151914cf062a67883f93"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.405759 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" event={"ID":"f088e084-6f3f-4f70-bcb8-53d6bc4cb34b","Type":"ContainerStarted","Data":"a357bf308232cbd7bc8a4be381a90c1f40ad7bee1cd4f1f99d60ffbd06d3cbb3"} Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.458598 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d7hz4" podStartSLOduration=4.633464814 podStartE2EDuration="6.45857795s" podCreationTimestamp="2026-02-17 14:12:34 +0000 UTC" firstStartedPulling="2026-02-17 14:12:34.881676854 +0000 UTC m=+435.461677506" lastFinishedPulling="2026-02-17 14:12:36.70678999 +0000 UTC m=+437.286790642" observedRunningTime="2026-02-17 14:12:40.449667738 +0000 UTC m=+441.029668410" watchObservedRunningTime="2026-02-17 14:12:40.45857795 +0000 UTC m=+441.038578602" Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.475598 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-w2vzn" podStartSLOduration=2.5578156930000002 podStartE2EDuration="6.475386968s" podCreationTimestamp="2026-02-17 14:12:34 +0000 UTC" firstStartedPulling="2026-02-17 14:12:35.413773408 +0000 UTC m=+435.993774060" lastFinishedPulling="2026-02-17 14:12:39.331344683 +0000 UTC m=+439.911345335" observedRunningTime="2026-02-17 14:12:40.473297096 +0000 UTC m=+441.053297758" watchObservedRunningTime="2026-02-17 14:12:40.475386968 +0000 UTC m=+441.055387620" Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.525928 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"] Feb 17 
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.527887 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.533922 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.535354 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.570206 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6544759b79-fvggd"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.683394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0317b822-b962-4a34-927b-5440573a6afb-monitoring-plugin-cert\") pod \"monitoring-plugin-67fdfc84c4-m26bs\" (UID: \"0317b822-b962-4a34-927b-5440573a6afb\") " pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.719995 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"]
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.786369 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0317b822-b962-4a34-927b-5440573a6afb-monitoring-plugin-cert\") pod \"monitoring-plugin-67fdfc84c4-m26bs\" (UID: \"0317b822-b962-4a34-927b-5440573a6afb\") " pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.792565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0317b822-b962-4a34-927b-5440573a6afb-monitoring-plugin-cert\") pod \"monitoring-plugin-67fdfc84c4-m26bs\" (UID: \"0317b822-b962-4a34-927b-5440573a6afb\") " pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"
Feb 17 14:12:40 crc kubenswrapper[4762]: I0217 14:12:40.882728 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"
Feb 17 14:12:41 crc kubenswrapper[4762]: I0217 14:12:41.094890 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"]
Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.622123 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" event={"ID":"96349f51-aa37-475c-b4f2-2aa495b6bdef","Type":"ContainerStarted","Data":"bf21153f00e1b65f4398f6f86f55a4cf869ab875b3177af9456bb51345f6d783"}
Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.638947 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cb59b7fc9-c5ld6" event={"ID":"090e1d23-2437-4cd0-97bd-39cd0a0b070b","Type":"ContainerStarted","Data":"f5d25532ede6ae0a6c9b418a5369c9f9e202e9a888ea01b4cb3a8256d710235c"}
Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.639035 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.645586 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646017 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646112 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646238 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646362 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646471 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646609 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646685 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646840 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-942ms" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.646951 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.647058 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-bnern8p5fnpq3" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.650243 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.657068 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.658571 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.659926 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nvcff" podStartSLOduration=3.992338796 podStartE2EDuration="7.65991146s" podCreationTimestamp="2026-02-17 14:12:34 +0000 UTC" firstStartedPulling="2026-02-17 14:12:35.664564269 +0000 UTC m=+436.244564921" lastFinishedPulling="2026-02-17 14:12:39.332136933 +0000 UTC m=+439.912137585" observedRunningTime="2026-02-17 14:12:41.657811927 +0000 UTC m=+442.237812569" watchObservedRunningTime="2026-02-17 14:12:41.65991146 +0000 UTC m=+442.239912112" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.673946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-config\") pod \"prometheus-k8s-0\" (UID: 
\"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.673998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-web-config\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674098 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674147 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674174 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674203 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674225 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674248 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674268 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674330 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674355 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674376 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-config-out\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674583 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzv8\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-kube-api-access-6nzv8\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.674604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775617 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775769 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzv8\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-kube-api-access-6nzv8\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.775810 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-config\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-web-config\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " 
pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777822 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777877 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777917 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777941 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.777993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.778014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.786076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.786927 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.787191 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.787418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.788144 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.790376 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.791977 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.792923 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48fa6888-f9c7-420f-adac-1d7ec337a495-config-out\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.794184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.794848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.797048 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.797426 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.797820 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48fa6888-f9c7-420f-adac-1d7ec337a495-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.797848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.798464 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-web-config\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.799809 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.807930 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzv8\" (UniqueName: \"kubernetes.io/projected/48fa6888-f9c7-420f-adac-1d7ec337a495-kube-api-access-6nzv8\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:41.808467 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa6888-f9c7-420f-adac-1d7ec337a495-config\") pod \"prometheus-k8s-0\" (UID: \"48fa6888-f9c7-420f-adac-1d7ec337a495\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc 
kubenswrapper[4762]: I0217 14:12:41.981958 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:42.631858 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs"] Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:42.635986 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6544759b79-fvggd"] Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:42.644841 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cb59b7fc9-c5ld6" event={"ID":"090e1d23-2437-4cd0-97bd-39cd0a0b070b","Type":"ContainerStarted","Data":"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10"} Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:42.652610 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Feb 17 14:12:42 crc kubenswrapper[4762]: I0217 14:12:42.666438 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cb59b7fc9-c5ld6" podStartSLOduration=3.666421742 podStartE2EDuration="3.666421742s" podCreationTimestamp="2026-02-17 14:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:12:42.661078099 +0000 UTC m=+443.241078751" watchObservedRunningTime="2026-02-17 14:12:42.666421742 +0000 UTC m=+443.246422394" Feb 17 14:12:44 crc kubenswrapper[4762]: W0217 14:12:44.426058 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0317b822_b962_4a34_927b_5440573a6afb.slice/crio-ac578d3d3851ee77acc2b20316a88651f3b4cca23380a18d3add9a2134777f51 WatchSource:0}: Error finding container ac578d3d3851ee77acc2b20316a88651f3b4cca23380a18d3add9a2134777f51: Status 404 returned error can't find the container with id ac578d3d3851ee77acc2b20316a88651f3b4cca23380a18d3add9a2134777f51 Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.715099 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"1c6cd82964b963a84e2ff00192b951a19384f2b59ccb488752a1c26dbcca3dd2"} Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.716617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" event={"ID":"db7e8c46-733c-49cf-8970-246ddf547747","Type":"ContainerStarted","Data":"e91ac6a810a82f9d30a53c6ca15ba0cd188229517ef4c75b4f8551266be75662"} Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.718922 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs" event={"ID":"0317b822-b962-4a34-927b-5440573a6afb","Type":"ContainerStarted","Data":"ac578d3d3851ee77acc2b20316a88651f3b4cca23380a18d3add9a2134777f51"} Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.720114 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"8a9a679e39a1a53428a237c4b00fade4988382a778a1892197e62f6a752c93de"} Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.720216 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"414af0f0be1b49299afa3088febf8214dc37b8d3ae6e84cda9e6fe24a79b5e96"} Feb 17 14:12:44 crc kubenswrapper[4762]: I0217 14:12:44.722791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"762e3ff78db9ed35730fbe03e3b543bfdf26ed735d187eb2899585e7568fe325"} Feb 17 14:12:44 crc kubenswrapper[4762]: E0217 14:12:44.948894 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fa6888_f9c7_420f_adac_1d7ec337a495.slice/crio-conmon-8a9a679e39a1a53428a237c4b00fade4988382a778a1892197e62f6a752c93de.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.730070 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"0d519e96584353e2048f9d40eb420c598d69ffd63f05a40aecf8743fdcea6b4c"} Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.730517 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"d452936a77271a656d57d840f87a274935cfd05c8fb3030840e7a7956969fc1b"} Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.730533 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"0ed0dd53128a435d8c40c6938628653bbc4c7dfcb2522c8bb5e05cbeda1a8cc8"} Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.731836 4762 generic.go:334] "Generic (PLEG): container finished" podID="48fa6888-f9c7-420f-adac-1d7ec337a495" containerID="8a9a679e39a1a53428a237c4b00fade4988382a778a1892197e62f6a752c93de" exitCode=0 Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.731880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerDied","Data":"8a9a679e39a1a53428a237c4b00fade4988382a778a1892197e62f6a752c93de"} Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.737363 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"74de4ba5a0953b93eb9d3325412e206a113c0d37473fc612fd4ee18956735a05"} Feb 17 14:12:45 crc kubenswrapper[4762]: I0217 14:12:45.737401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"8c754930215bf0a0ccbc5d3f337d0e2b292013f04acfee3dce1479045e67501e"} Feb 17 14:12:46 crc kubenswrapper[4762]: I0217 14:12:46.745811 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"bcc5d83a4f05d97e5e8eb45c28e0d3e8fa6ca463bb480ced7b0c98d923f86a88"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.928557 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"559a5052dff8e84272c4470856f0fbc953cf3a120fee4b662ab6279c1ac685c7"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.928935 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.928952 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"8f0b832eec6e6a31ac65531df2e2b2bbd30e862a01a30588a76408965960cb4d"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.928968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" event={"ID":"ba3c53b4-fea6-4c10-af28-1461348ffbd1","Type":"ContainerStarted","Data":"c2e629073e8f4f94d55b9ddbdb5c55261e305ea0f45c7fa0864f0b10bcd0719a"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.933384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ff4c832c-bd71-458c-ab27-0119e342986c","Type":"ContainerStarted","Data":"bcaa6dadcf74764a64d420ea707c9ade2b4c3d670e803db4e50709351a81b694"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.935874 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" event={"ID":"db7e8c46-733c-49cf-8970-246ddf547747","Type":"ContainerStarted","Data":"ba37eca93806fdec946d0ee9776b836606fe304ed745b299add29dd0c8282479"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.938130 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs" event={"ID":"0317b822-b962-4a34-927b-5440573a6afb","Type":"ContainerStarted","Data":"511a07e6520b35e97cfa1b9421e234672efe3b6639457daf60e9039dad50dc1f"} Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.938338 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs" Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.944379 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs" Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.952794 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" podStartSLOduration=2.442535801 podStartE2EDuration="12.952777216s" podCreationTimestamp="2026-02-17 14:12:36 +0000 UTC" firstStartedPulling="2026-02-17 14:12:37.881700703 +0000 UTC m=+438.461701355" lastFinishedPulling="2026-02-17 14:12:48.391942118 +0000 UTC m=+448.971942770" observedRunningTime="2026-02-17 14:12:48.952367635 +0000 UTC m=+449.532368307" watchObservedRunningTime="2026-02-17 14:12:48.952777216 +0000 UTC m=+449.532777858" Feb 17 14:12:48 crc kubenswrapper[4762]: I0217 14:12:48.972083 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-67fdfc84c4-m26bs" podStartSLOduration=5.026736638 podStartE2EDuration="8.972060725s" podCreationTimestamp="2026-02-17 14:12:40 +0000 UTC" firstStartedPulling="2026-02-17 14:12:44.44661679 +0000 UTC m=+445.026617432" lastFinishedPulling="2026-02-17 14:12:48.391940867 +0000 UTC m=+448.971941519" 
observedRunningTime="2026-02-17 14:12:48.96583341 +0000 UTC m=+449.545834082" watchObservedRunningTime="2026-02-17 14:12:48.972060725 +0000 UTC m=+449.552061397" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.024904 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.291118873 podStartE2EDuration="14.02488609s" podCreationTimestamp="2026-02-17 14:12:35 +0000 UTC" firstStartedPulling="2026-02-17 14:12:36.663597515 +0000 UTC m=+437.243598167" lastFinishedPulling="2026-02-17 14:12:48.397364722 +0000 UTC m=+448.977365384" observedRunningTime="2026-02-17 14:12:49.016807479 +0000 UTC m=+449.596808151" watchObservedRunningTime="2026-02-17 14:12:49.02488609 +0000 UTC m=+449.604886742" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.033358 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" podStartSLOduration=6.062237152 podStartE2EDuration="10.033340271s" podCreationTimestamp="2026-02-17 14:12:39 +0000 UTC" firstStartedPulling="2026-02-17 14:12:44.421175897 +0000 UTC m=+445.001176549" lastFinishedPulling="2026-02-17 14:12:48.392279016 +0000 UTC m=+448.972279668" observedRunningTime="2026-02-17 14:12:49.032361916 +0000 UTC m=+449.612362568" watchObservedRunningTime="2026-02-17 14:12:49.033340271 +0000 UTC m=+449.613340923" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.909264 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.909790 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.915342 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.958564 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:12:49 crc kubenswrapper[4762]: I0217 14:12:49.998448 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6db88d458f-nd42s" Feb 17 14:12:50 crc kubenswrapper[4762]: I0217 14:12:50.045719 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-54mm8"] Feb 17 14:12:52 crc kubenswrapper[4762]: I0217 14:12:52.967311 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"bfdb7e7524cc2c4cb5717a268acf8ee386b4ff74f994da72898871773c4e870c"} Feb 17 14:12:53 crc kubenswrapper[4762]: I0217 14:12:53.999418 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"0cc609d0488bc99d208e7bb0315a05c87ac2df45e58aaa33e2c4913c49def795"} Feb 17 14:12:53 crc kubenswrapper[4762]: I0217 14:12:53.999488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"00ca2acce73f83c0b9a39ff9db4be7ad7922d2f36c746cea9edb26aaa889f7d2"} Feb 17 14:12:53 crc kubenswrapper[4762]: I0217 14:12:53.999518 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"2520a82ccc57bdabc6e9bad18c08853ddec06bb2289d6afeb0d62145efa9f7a4"} Feb 17 14:12:55 crc kubenswrapper[4762]: I0217 14:12:55.011256 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"b53e094ea0b670dfc9a3848da12cecc7b21f57e454c2c73a4a0055733bdffe21"} Feb 17 14:12:55 crc kubenswrapper[4762]: I0217 14:12:55.011600 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"48fa6888-f9c7-420f-adac-1d7ec337a495","Type":"ContainerStarted","Data":"bb349ccf79f18c0f2152ac99673f5e45969ba5de756d1c359b1ed5dd9db50cad"} Feb 17 14:12:55 crc kubenswrapper[4762]: I0217 14:12:55.061235 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=7.055042822 podStartE2EDuration="14.061218992s" podCreationTimestamp="2026-02-17 14:12:41 +0000 UTC" firstStartedPulling="2026-02-17 14:12:45.733604752 +0000 UTC m=+446.313605404" lastFinishedPulling="2026-02-17 14:12:52.739780932 +0000 UTC m=+453.319781574" observedRunningTime="2026-02-17 14:12:55.058361171 +0000 UTC m=+455.638361843" watchObservedRunningTime="2026-02-17 14:12:55.061218992 +0000 UTC m=+455.641219644" Feb 17 14:12:56 crc kubenswrapper[4762]: I0217 14:12:56.982780 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:13:00 crc kubenswrapper[4762]: I0217 14:13:00.571797 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:13:00 crc kubenswrapper[4762]: I0217 14:13:00.571896 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.096322 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-54mm8" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" containerID="cri-o://9e696a6f7238329a5d4bccd348be6fc2d7bbdeadbcbf8c2bac2f016c90c416e1" gracePeriod=15 Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.365875 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-54mm8_151149d5-152a-49f8-8c5f-453e68dc4bf5/console/0.log" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.365933 4762 generic.go:334] "Generic (PLEG): container finished" podID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerID="9e696a6f7238329a5d4bccd348be6fc2d7bbdeadbcbf8c2bac2f016c90c416e1" exitCode=2 Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.365965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-54mm8" event={"ID":"151149d5-152a-49f8-8c5f-453e68dc4bf5","Type":"ContainerDied","Data":"9e696a6f7238329a5d4bccd348be6fc2d7bbdeadbcbf8c2bac2f016c90c416e1"} Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.471084 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-54mm8_151149d5-152a-49f8-8c5f-453e68dc4bf5/console/0.log" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.471163 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593046 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593410 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7zb2\" (UniqueName: \"kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593503 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593772 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca\") pod \"151149d5-152a-49f8-8c5f-453e68dc4bf5\" (UID: \"151149d5-152a-49f8-8c5f-453e68dc4bf5\") " Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.593797 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.594191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config" (OuterVolumeSpecName: "console-config") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.594183 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.594607 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca" (OuterVolumeSpecName: "service-ca") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.594644 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.601476 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.601717 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2" (OuterVolumeSpecName: "kube-api-access-g7zb2") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "kube-api-access-g7zb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.601845 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "151149d5-152a-49f8-8c5f-453e68dc4bf5" (UID: "151149d5-152a-49f8-8c5f-453e68dc4bf5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695551 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695579 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7zb2\" (UniqueName: \"kubernetes.io/projected/151149d5-152a-49f8-8c5f-453e68dc4bf5-kube-api-access-g7zb2\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695589 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695598 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695605 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/151149d5-152a-49f8-8c5f-453e68dc4bf5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:15 crc kubenswrapper[4762]: I0217 14:13:15.695613 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/151149d5-152a-49f8-8c5f-453e68dc4bf5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.375753 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-54mm8_151149d5-152a-49f8-8c5f-453e68dc4bf5/console/0.log" Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.375877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-54mm8" event={"ID":"151149d5-152a-49f8-8c5f-453e68dc4bf5","Type":"ContainerDied","Data":"c6aad3bb942412eed53be77e9ea8cd21deecfc1a2f77ab31f6dd3298a48fe5a7"} Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.375967 4762 scope.go:117] "RemoveContainer" containerID="9e696a6f7238329a5d4bccd348be6fc2d7bbdeadbcbf8c2bac2f016c90c416e1" Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.375988 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-54mm8" Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.402471 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-54mm8"] Feb 17 14:13:16 crc kubenswrapper[4762]: I0217 14:13:16.409848 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-54mm8"] Feb 17 14:13:18 crc kubenswrapper[4762]: I0217 14:13:18.079022 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" path="/var/lib/kubelet/pods/151149d5-152a-49f8-8c5f-453e68dc4bf5/volumes" Feb 17 14:13:20 crc kubenswrapper[4762]: I0217 14:13:20.579256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:13:20 crc kubenswrapper[4762]: I0217 14:13:20.584468 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6544759b79-fvggd" Feb 17 14:13:41 crc kubenswrapper[4762]: I0217 14:13:41.982371 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:13:42 crc kubenswrapper[4762]: I0217 14:13:42.010154 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:13:42 crc kubenswrapper[4762]: I0217 14:13:42.578979 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.024559 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:13:59 crc kubenswrapper[4762]: E0217 14:13:59.026529 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.026549 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.026703 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="151149d5-152a-49f8-8c5f-453e68dc4bf5" containerName="console" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.027104 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.040721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.217863 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.217910 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.217938 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.217969 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6s2\" (UniqueName: \"kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.218109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.218174 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.218203 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.319536 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc 
kubenswrapper[4762]: I0217 14:13:59.319621 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.319703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.319770 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6s2\" (UniqueName: \"kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.319895 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.319965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.320028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.322295 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.322613 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.323318 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.324042 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.328260 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.328957 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.339055 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6s2\" (UniqueName: \"kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2\") pod \"console-86c5f45bcb-954rj\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.352019 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:13:59 crc kubenswrapper[4762]: I0217 14:13:59.890590 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:14:00 crc kubenswrapper[4762]: I0217 14:14:00.677366 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c5f45bcb-954rj" event={"ID":"36ae5bb3-63ce-4c9e-a891-c83b6ff22576","Type":"ContainerStarted","Data":"e2a227c620335e07b393b55093cee34504975bfcf2184304a2b4a6d8f1adcc33"} Feb 17 14:14:00 crc kubenswrapper[4762]: I0217 14:14:00.677417 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c5f45bcb-954rj" event={"ID":"36ae5bb3-63ce-4c9e-a891-c83b6ff22576","Type":"ContainerStarted","Data":"69c080d6e7ce862c43827c5762e2241dbb82a1455b0e858be45d7c62cfe62c6b"} Feb 17 14:14:00 crc kubenswrapper[4762]: I0217 14:14:00.695141 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86c5f45bcb-954rj" podStartSLOduration=2.695102812 podStartE2EDuration="2.695102812s" podCreationTimestamp="2026-02-17 14:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:14:00.694934267 +0000 UTC m=+521.274934939" watchObservedRunningTime="2026-02-17 14:14:00.695102812 +0000 UTC m=+521.275103464" Feb 17 14:14:09 crc kubenswrapper[4762]: I0217 14:14:09.352190 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:14:09 crc kubenswrapper[4762]: I0217 14:14:09.352735 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:14:09 crc kubenswrapper[4762]: I0217 14:14:09.357484 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:14:09 crc kubenswrapper[4762]: I0217 14:14:09.738034 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:14:09 crc kubenswrapper[4762]: I0217 14:14:09.797979 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"] Feb 17 14:14:24 crc kubenswrapper[4762]: I0217 14:14:24.621419 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:14:24 crc kubenswrapper[4762]: I0217 14:14:24.621982 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:14:34 crc kubenswrapper[4762]: I0217 14:14:34.849420 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5cb59b7fc9-c5ld6" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerName="console" containerID="cri-o://aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10" gracePeriod=15 Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.197956 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cb59b7fc9-c5ld6_090e1d23-2437-4cd0-97bd-39cd0a0b070b/console/0.log" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.198219 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266022 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8v5\" (UniqueName: \"kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266076 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266114 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266166 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266204 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266274 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.266328 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle\") pod \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\" (UID: \"090e1d23-2437-4cd0-97bd-39cd0a0b070b\") " Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.267446 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca" (OuterVolumeSpecName: "service-ca") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.267459 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config" (OuterVolumeSpecName: "console-config") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.267775 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.268707 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.274870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.274876 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5" (OuterVolumeSpecName: "kube-api-access-jc8v5") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "kube-api-access-jc8v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.278823 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "090e1d23-2437-4cd0-97bd-39cd0a0b070b" (UID: "090e1d23-2437-4cd0-97bd-39cd0a0b070b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368218 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8v5\" (UniqueName: \"kubernetes.io/projected/090e1d23-2437-4cd0-97bd-39cd0a0b070b-kube-api-access-jc8v5\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368245 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368255 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368263 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368271 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090e1d23-2437-4cd0-97bd-39cd0a0b070b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368279 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.368290 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090e1d23-2437-4cd0-97bd-39cd0a0b070b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918019 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cb59b7fc9-c5ld6_090e1d23-2437-4cd0-97bd-39cd0a0b070b/console/0.log" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918328 4762 generic.go:334] "Generic (PLEG): container finished" podID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerID="aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10" exitCode=2 Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918364 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cb59b7fc9-c5ld6" event={"ID":"090e1d23-2437-4cd0-97bd-39cd0a0b070b","Type":"ContainerDied","Data":"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10"} Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918414 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cb59b7fc9-c5ld6" Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cb59b7fc9-c5ld6" event={"ID":"090e1d23-2437-4cd0-97bd-39cd0a0b070b","Type":"ContainerDied","Data":"f5d25532ede6ae0a6c9b418a5369c9f9e202e9a888ea01b4cb3a8256d710235c"} Feb 17 14:14:35 crc kubenswrapper[4762]: I0217 14:14:35.918445 4762 scope.go:117] "RemoveContainer" containerID="aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10" Feb 17 14:14:36 crc kubenswrapper[4762]: I0217 14:14:36.007150 4762 scope.go:117] "RemoveContainer" containerID="aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10" Feb 17 14:14:36 crc kubenswrapper[4762]: E0217 14:14:36.007550 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10\": container with ID starting with aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10 not found: ID does not exist" containerID="aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10" Feb 17 14:14:36 crc kubenswrapper[4762]: I0217 14:14:36.007592 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10"} err="failed to get container status \"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10\": rpc error: code = NotFound desc = could not find container \"aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10\": container with ID starting with aaaac50c9b636c37140a7964436685268d7fc8827c7fe1a0e99fec6c8b558f10 not found: ID does not exist" Feb 17 14:14:36 crc kubenswrapper[4762]: I0217 14:14:36.011804 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"] Feb 17 14:14:36 crc kubenswrapper[4762]: I0217 14:14:36.014351 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cb59b7fc9-c5ld6"] Feb 17 14:14:36 crc kubenswrapper[4762]: I0217 14:14:36.078999 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" path="/var/lib/kubelet/pods/090e1d23-2437-4cd0-97bd-39cd0a0b070b/volumes" Feb 17 14:14:54 crc kubenswrapper[4762]: I0217 14:14:54.622384 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:14:54 crc kubenswrapper[4762]: I0217 14:14:54.623049 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.221112 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"] Feb 17 14:15:00 crc kubenswrapper[4762]: E0217 14:15:00.221731 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerName="console" Feb 17 14:15:00 crc 
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.221112 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"]
Feb 17 14:15:00 crc kubenswrapper[4762]: E0217 14:15:00.221731 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerName="console"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.221749 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerName="console"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.221893 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="090e1d23-2437-4cd0-97bd-39cd0a0b070b" containerName="console"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.222387 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.225427 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.225662 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.244320 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"]
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.358247 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.358598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkb2x\" (UniqueName: \"kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.358797 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.459836 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.460082 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.460118 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkb2x\" (UniqueName: \"kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.460930 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.466939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.484334 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkb2x\" (UniqueName: \"kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x\") pod \"collect-profiles-29522295-b4nf2\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.541761 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:00 crc kubenswrapper[4762]: I0217 14:15:00.761210 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"]
Feb 17 14:15:01 crc kubenswrapper[4762]: I0217 14:15:01.198524 4762 generic.go:334] "Generic (PLEG): container finished" podID="c5857520-9cb4-4bec-b0e7-09b2ba661150" containerID="0b48692c02352a92ac95d2d5b2fa5703bd72e8ab666d4617753c3dce3b6feb3d" exitCode=0
Feb 17 14:15:01 crc kubenswrapper[4762]: I0217 14:15:01.198628 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2" event={"ID":"c5857520-9cb4-4bec-b0e7-09b2ba661150","Type":"ContainerDied","Data":"0b48692c02352a92ac95d2d5b2fa5703bd72e8ab666d4617753c3dce3b6feb3d"}
Feb 17 14:15:01 crc kubenswrapper[4762]: I0217 14:15:01.198987 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2" event={"ID":"c5857520-9cb4-4bec-b0e7-09b2ba661150","Type":"ContainerStarted","Data":"c8b09283b243259c4de25a346e8d6a0a2fe1dfa25c1c4eb54767bb9fa58235a2"}
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.476265 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.618801 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkb2x\" (UniqueName: \"kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x\") pod \"c5857520-9cb4-4bec-b0e7-09b2ba661150\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") "
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.618931 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume\") pod \"c5857520-9cb4-4bec-b0e7-09b2ba661150\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") "
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.619019 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume\") pod \"c5857520-9cb4-4bec-b0e7-09b2ba661150\" (UID: \"c5857520-9cb4-4bec-b0e7-09b2ba661150\") "
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.621058 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5857520-9cb4-4bec-b0e7-09b2ba661150" (UID: "c5857520-9cb4-4bec-b0e7-09b2ba661150"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.625327 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x" (OuterVolumeSpecName: "kube-api-access-hkb2x") pod "c5857520-9cb4-4bec-b0e7-09b2ba661150" (UID: "c5857520-9cb4-4bec-b0e7-09b2ba661150"). InnerVolumeSpecName "kube-api-access-hkb2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.625410 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5857520-9cb4-4bec-b0e7-09b2ba661150" (UID: "c5857520-9cb4-4bec-b0e7-09b2ba661150"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.721630 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkb2x\" (UniqueName: \"kubernetes.io/projected/c5857520-9cb4-4bec-b0e7-09b2ba661150-kube-api-access-hkb2x\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.721681 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5857520-9cb4-4bec-b0e7-09b2ba661150-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:02 crc kubenswrapper[4762]: I0217 14:15:02.721692 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5857520-9cb4-4bec-b0e7-09b2ba661150-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4762]: I0217 14:15:03.214787 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2" event={"ID":"c5857520-9cb4-4bec-b0e7-09b2ba661150","Type":"ContainerDied","Data":"c8b09283b243259c4de25a346e8d6a0a2fe1dfa25c1c4eb54767bb9fa58235a2"}
Feb 17 14:15:03 crc kubenswrapper[4762]: I0217 14:15:03.214833 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b09283b243259c4de25a346e8d6a0a2fe1dfa25c1c4eb54767bb9fa58235a2"
Feb 17 14:15:03 crc kubenswrapper[4762]: I0217 14:15:03.214854 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-b4nf2"
Feb 17 14:15:24 crc kubenswrapper[4762]: I0217 14:15:24.621375 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:15:24 crc kubenswrapper[4762]: I0217 14:15:24.621879 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:15:24 crc kubenswrapper[4762]: I0217 14:15:24.621919 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp"
Feb 17 14:15:24 crc kubenswrapper[4762]: I0217 14:15:24.622465 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:15:24 crc kubenswrapper[4762]: I0217 14:15:24.622509 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b" gracePeriod=600
podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b" exitCode=0 Feb 17 14:15:25 crc kubenswrapper[4762]: I0217 14:15:25.369349 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b"} Feb 17 14:15:25 crc kubenswrapper[4762]: I0217 14:15:25.369559 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841"} Feb 17 14:15:25 crc kubenswrapper[4762]: I0217 14:15:25.369581 4762 scope.go:117] "RemoveContainer" containerID="b5d43767687fdd610ba4f9520d77c20e66f875c84b97c517f7b3ba8e012bd4b7" Feb 17 14:17:24 crc kubenswrapper[4762]: I0217 14:17:24.621817 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:17:24 crc kubenswrapper[4762]: I0217 14:17:24.622371 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.245319 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz"] Feb 17 14:17:54 crc kubenswrapper[4762]: E0217 14:17:54.246056 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5857520-9cb4-4bec-b0e7-09b2ba661150" containerName="collect-profiles" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.246069 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5857520-9cb4-4bec-b0e7-09b2ba661150" containerName="collect-profiles" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.246195 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5857520-9cb4-4bec-b0e7-09b2ba661150" containerName="collect-profiles" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.247034 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.249504 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.258216 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz"] Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.348953 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.349332 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.349397 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpv4z\" (UniqueName: \"kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.450989 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.451063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.451103 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpv4z\" (UniqueName: \"kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.451545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.451717 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.470142 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpv4z\" (UniqueName: \"kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.564975 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.621403 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.621469 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.785059 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz"] Feb 17 14:17:54 crc kubenswrapper[4762]: I0217 14:17:54.858863 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerStarted","Data":"0e37a9888212f47ca2715e76867f5dbfbf12cab8890c565e68600bd3ffcda313"} Feb 17 14:17:55 crc kubenswrapper[4762]: I0217 14:17:55.866586 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerStarted","Data":"f50f6ef4dc253d881320a0023fabf1bf819ffcfc233fe3858f34c440723d390c"} Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.549170 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.602013 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.603311 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.617384 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.684163 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.684236 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.684280 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tss9n\" (UniqueName: \"kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.785669 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.785756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tss9n\" (UniqueName: \"kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.785833 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.786311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.786336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.808593 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tss9n\" (UniqueName: \"kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n\") pod \"redhat-operators-wvswd\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.873708 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerID="f50f6ef4dc253d881320a0023fabf1bf819ffcfc233fe3858f34c440723d390c" exitCode=0 Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.873751 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerDied","Data":"f50f6ef4dc253d881320a0023fabf1bf819ffcfc233fe3858f34c440723d390c"} Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.879006 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:17:56 crc kubenswrapper[4762]: I0217 14:17:56.974087 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:17:57 crc kubenswrapper[4762]: I0217 14:17:57.531198 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:17:57 crc kubenswrapper[4762]: I0217 14:17:57.880574 4762 generic.go:334] "Generic (PLEG): container finished" podID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerID="c6332179c33b0fd35f102a359076f9b54e2ffcfdc44120325909f67486948bdc" exitCode=0 Feb 17 14:17:57 crc kubenswrapper[4762]: I0217 14:17:57.880707 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerDied","Data":"c6332179c33b0fd35f102a359076f9b54e2ffcfdc44120325909f67486948bdc"} Feb 17 14:17:57 crc kubenswrapper[4762]: I0217 14:17:57.880940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerStarted","Data":"7734a9cbfe688e81763ba15e89047c2b00defa3b4daf7604134229846ce6a2dd"} Feb 17 14:17:58 crc kubenswrapper[4762]: I0217 14:17:58.894054 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerID="065ebc30a1c632629767246811e8267b1e579b08892c26fff6ff1af5f5e6348e" exitCode=0 Feb 17 14:17:58 crc kubenswrapper[4762]: I0217 14:17:58.894172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerDied","Data":"065ebc30a1c632629767246811e8267b1e579b08892c26fff6ff1af5f5e6348e"} Feb 17 14:17:58 crc kubenswrapper[4762]: I0217 14:17:58.899570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerStarted","Data":"ec5888efafd82f4032e941295a8b012724d6bdad845cb96c969c39b6142b8a56"} Feb 17 14:17:59 crc kubenswrapper[4762]: I0217 14:17:59.907676 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" 
event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerStarted","Data":"fc5a1d8912f406087863dc4f1b7e67e37ff5b1af65984a0662d0d95b4d397c08"} Feb 17 14:17:59 crc kubenswrapper[4762]: I0217 14:17:59.926784 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" podStartSLOduration=4.941562739 podStartE2EDuration="5.926761044s" podCreationTimestamp="2026-02-17 14:17:54 +0000 UTC" firstStartedPulling="2026-02-17 14:17:56.878706732 +0000 UTC m=+757.458707384" lastFinishedPulling="2026-02-17 14:17:57.863905037 +0000 UTC m=+758.443905689" observedRunningTime="2026-02-17 14:17:59.922016334 +0000 UTC m=+760.502016986" watchObservedRunningTime="2026-02-17 14:17:59.926761044 +0000 UTC m=+760.506761696" Feb 17 14:18:01 crc kubenswrapper[4762]: I0217 14:18:01.939322 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerID="fc5a1d8912f406087863dc4f1b7e67e37ff5b1af65984a0662d0d95b4d397c08" exitCode=0 Feb 17 14:18:01 crc kubenswrapper[4762]: I0217 14:18:01.939375 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerDied","Data":"fc5a1d8912f406087863dc4f1b7e67e37ff5b1af65984a0662d0d95b4d397c08"} Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.163432 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.568755 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpv4z\" (UniqueName: \"kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z\") pod \"2c0144bd-21f9-4515-909e-dfc320b5e239\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.568906 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util\") pod \"2c0144bd-21f9-4515-909e-dfc320b5e239\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.568996 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle\") pod \"2c0144bd-21f9-4515-909e-dfc320b5e239\" (UID: \"2c0144bd-21f9-4515-909e-dfc320b5e239\") " Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.570580 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle" (OuterVolumeSpecName: "bundle") pod "2c0144bd-21f9-4515-909e-dfc320b5e239" (UID: "2c0144bd-21f9-4515-909e-dfc320b5e239"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.581133 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util" (OuterVolumeSpecName: "util") pod "2c0144bd-21f9-4515-909e-dfc320b5e239" (UID: "2c0144bd-21f9-4515-909e-dfc320b5e239"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.581991 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z" (OuterVolumeSpecName: "kube-api-access-wpv4z") pod "2c0144bd-21f9-4515-909e-dfc320b5e239" (UID: "2c0144bd-21f9-4515-909e-dfc320b5e239"). InnerVolumeSpecName "kube-api-access-wpv4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.670706 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.670741 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0144bd-21f9-4515-909e-dfc320b5e239-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.670754 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpv4z\" (UniqueName: \"kubernetes.io/projected/2c0144bd-21f9-4515-909e-dfc320b5e239-kube-api-access-wpv4z\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.754330 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vksr"] Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.754922 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-controller" containerID="cri-o://625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.755002 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="nbdb" containerID="cri-o://7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.755074 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="northd" containerID="cri-o://327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.755160 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-acl-logging" containerID="cri-o://a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.755222 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-node" containerID="cri-o://b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.755328 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="sbdb" 
containerID="cri-o://3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.756482 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.791371 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" containerID="cri-o://b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" gracePeriod=30 Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.972342 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" event={"ID":"2c0144bd-21f9-4515-909e-dfc320b5e239","Type":"ContainerDied","Data":"0e37a9888212f47ca2715e76867f5dbfbf12cab8890c565e68600bd3ffcda313"} Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.972785 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e37a9888212f47ca2715e76867f5dbfbf12cab8890c565e68600bd3ffcda313" Feb 17 14:18:04 crc kubenswrapper[4762]: I0217 14:18:04.972404 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.809083 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/3.log" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.811917 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovn-acl-logging/0.log" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.812401 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovn-controller/0.log" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.812912 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857740 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857814 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857889 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857907 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857941 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.857972 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858025 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858050 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: 
\"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858304 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log" (OuterVolumeSpecName: "node-log") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858342 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858430 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858488 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket" (OuterVolumeSpecName: "log-socket") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858913 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.858945 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.859231 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862535 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kzpnp"] Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862848 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="northd" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862864 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="northd" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862874 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862879 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862888 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="nbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862894 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="nbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862901 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="extract" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862907 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="extract" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862920 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862925 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862950 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kubecfg-setup" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862955 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kubecfg-setup" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862964 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="pull" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862970 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="pull" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862979 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.862985 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.862991 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-acl-logging" Feb 17 14:18:06 crc 
kubenswrapper[4762]: I0217 14:18:06.862996 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-acl-logging" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863003 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863009 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863019 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="util" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863024 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="util" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863036 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863042 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863049 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="sbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863054 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="sbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863062 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-node" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863068 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-node" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863222 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="sbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863253 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0144bd-21f9-4515-909e-dfc320b5e239" containerName="extract" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863262 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863273 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-acl-logging" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863282 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863290 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="nbdb" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863296 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" 
containerName="kube-rbac-proxy-node" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863306 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863317 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863326 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovn-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863338 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="northd" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863350 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863522 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863534 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863741 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: E0217 14:18:06.863915 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.863926 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerName="ovnkube-controller" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.865531 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.881157 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.887304 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959412 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959754 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959529 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959802 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959833 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959837 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959885 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959894 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.959924 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash" (OuterVolumeSpecName: "host-slash") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960023 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960049 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960068 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960095 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960126 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960159 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m5t9\" (UniqueName: \"kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9\") pod \"ab134be0-88ef-45ac-80e0-963a60169ad2\" (UID: \"ab134be0-88ef-45ac-80e0-963a60169ad2\") " Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-kubelet\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960475 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960502 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-config\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960527 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-slash\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960565 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-systemd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960634 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-netns\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960712 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-env-overrides\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960751 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k959\" (UniqueName: \"kubernetes.io/projected/3a383139-de98-4e23-92ce-df401c79b08c-kube-api-access-2k959\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960772 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-netd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960801 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-script-lib\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960885 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-var-lib-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-systemd-units\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-log-socket\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.960966 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-node-log\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961002 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a383139-de98-4e23-92ce-df401c79b08c-ovn-node-metrics-cert\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961039 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-bin\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961059 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-ovn\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961081 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-etc-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961167 4762 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961177 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961187 4762 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961196 4762 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961205 4762 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961213 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab134be0-88ef-45ac-80e0-963a60169ad2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961221 4762 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961229 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961237 4762 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961245 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961253 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961261 4762 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 
14:18:06.961269 4762 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961286 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961294 4762 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961302 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961341 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961357 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.961630 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:18:06 crc kubenswrapper[4762]: I0217 14:18:06.965258 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9" (OuterVolumeSpecName: "kube-api-access-8m5t9") pod "ab134be0-88ef-45ac-80e0-963a60169ad2" (UID: "ab134be0-88ef-45ac-80e0-963a60169ad2"). InnerVolumeSpecName "kube-api-access-8m5t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.003499 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/2.log" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.004004 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/1.log" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.004056 4762 generic.go:334] "Generic (PLEG): container finished" podID="c1057884-d2c5-4911-9b97-fb4fedba9ab1" containerID="2180feb9a7871567c44d5f79b87d557e3bcdb1bc5b223e164d5df42091fc7302" exitCode=2 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.004139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerDied","Data":"2180feb9a7871567c44d5f79b87d557e3bcdb1bc5b223e164d5df42091fc7302"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.004211 4762 scope.go:117] "RemoveContainer" containerID="97b30da58ae2262858da3a6bc5331e386975ce75aea8ae63239fdba83d50a9e3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.004896 4762 scope.go:117] "RemoveContainer" containerID="2180feb9a7871567c44d5f79b87d557e3bcdb1bc5b223e164d5df42091fc7302" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.006682 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovnkube-controller/3.log" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011019 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovn-acl-logging/0.log" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011534 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vksr_ab134be0-88ef-45ac-80e0-963a60169ad2/ovn-controller/0.log" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011841 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011867 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011876 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011888 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011897 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011905 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" 
containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011913 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" exitCode=143 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011920 4762 generic.go:334] "Generic (PLEG): container finished" podID="ab134be0-88ef-45ac-80e0-963a60169ad2" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" exitCode=143 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.011969 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012001 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012218 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012243 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012255 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012266 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012279 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012285 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012290 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 
14:18:07.012295 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012300 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012306 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012314 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012321 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012328 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012337 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012348 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012355 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012361 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012368 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012375 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012382 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012404 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 
14:18:07.012411 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012417 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012424 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012442 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012449 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012456 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012466 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012472 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012479 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012485 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012491 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012497 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012504 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 
14:18:07.012514 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" event={"ID":"ab134be0-88ef-45ac-80e0-963a60169ad2","Type":"ContainerDied","Data":"68b1affc067a8160a4de26baac09a6bc0782eec9060a2a6bcba2732a213a64e4"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012523 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012530 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012536 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012542 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012548 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012554 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012561 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012567 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012573 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012580 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.012721 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vksr" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.019188 4762 generic.go:334] "Generic (PLEG): container finished" podID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerID="ec5888efafd82f4032e941295a8b012724d6bdad845cb96c969c39b6142b8a56" exitCode=0 Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.019231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerDied","Data":"ec5888efafd82f4032e941295a8b012724d6bdad845cb96c969c39b6142b8a56"} Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.044563 4762 scope.go:117] "RemoveContainer" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065026 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-bin\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065100 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-ovn\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-etc-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065186 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-kubelet\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065204 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065252 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-slash\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065282 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-config\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 
14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-systemd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065357 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-netns\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-etc-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065431 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065495 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-env-overrides\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065522 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k959\" (UniqueName: \"kubernetes.io/projected/3a383139-de98-4e23-92ce-df401c79b08c-kube-api-access-2k959\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-netd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-kubelet\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065889 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-slash\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065904 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-netd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065905 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065934 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-run-netns\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065938 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-host-cni-bin\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.065967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-ovn\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-systemd\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066087 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-script-lib\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066536 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-var-lib-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066561 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-systemd-units\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066607 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-env-overrides\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066626 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-log-socket\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066701 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-node-log\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066718 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-var-lib-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066765 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-config\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-systemd-units\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a383139-de98-4e23-92ce-df401c79b08c-ovnkube-script-lib\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066957 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-log-socket\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067012 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-node-log\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.066762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a383139-de98-4e23-92ce-df401c79b08c-ovn-node-metrics-cert\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067167 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a383139-de98-4e23-92ce-df401c79b08c-run-openvswitch\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067199 4762 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067213 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab134be0-88ef-45ac-80e0-963a60169ad2-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067224 4762 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab134be0-88ef-45ac-80e0-963a60169ad2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.067237 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m5t9\" (UniqueName: \"kubernetes.io/projected/ab134be0-88ef-45ac-80e0-963a60169ad2-kube-api-access-8m5t9\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.068419 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.069018 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vksr"] Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.081752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a383139-de98-4e23-92ce-df401c79b08c-ovn-node-metrics-cert\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.099354 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k959\" (UniqueName: \"kubernetes.io/projected/3a383139-de98-4e23-92ce-df401c79b08c-kube-api-access-2k959\") pod \"ovnkube-node-kzpnp\" (UID: \"3a383139-de98-4e23-92ce-df401c79b08c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.101459 4762 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vksr"] Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.105800 4762 scope.go:117] "RemoveContainer" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.145762 4762 scope.go:117] "RemoveContainer" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.162524 4762 scope.go:117] "RemoveContainer" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.177834 4762 scope.go:117] "RemoveContainer" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.197581 4762 scope.go:117] "RemoveContainer" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.211102 4762 scope.go:117] "RemoveContainer" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.222998 4762 scope.go:117] "RemoveContainer" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.226797 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.252857 4762 scope.go:117] "RemoveContainer" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.324956 4762 scope.go:117] "RemoveContainer" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.325403 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": container with ID starting with b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882 not found: ID does not exist" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.325444 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} err="failed to get container status \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": rpc error: code = NotFound desc = could not find container \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": container with ID starting with b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.325483 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.325973 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": container with ID starting with 12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d not found: ID does not exist" 
containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.326020 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} err="failed to get container status \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": rpc error: code = NotFound desc = could not find container \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": container with ID starting with 12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.326070 4762 scope.go:117] "RemoveContainer" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.326418 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": container with ID starting with 3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078 not found: ID does not exist" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.326457 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} err="failed to get container status \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": rpc error: code = NotFound desc = could not find container \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": container with ID starting with 3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.326486 4762 scope.go:117] "RemoveContainer" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.326956 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": container with ID starting with 7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7 not found: ID does not exist" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.326996 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} err="failed to get container status \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": rpc error: code = NotFound desc = could not find container \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": container with ID starting with 7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.327026 4762 scope.go:117] "RemoveContainer" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.327338 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": container with ID starting with 327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b not found: ID does not exist" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.327386 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} err="failed to get container status \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": rpc error: code = NotFound desc = could not find container \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": container with ID starting with 327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.327408 4762 scope.go:117] "RemoveContainer" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.327893 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": container with ID starting with c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd not found: ID does not exist" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.327922 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} err="failed to get container status \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": rpc error: code = NotFound desc = could not find container \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": container with ID starting with c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.327940 4762 scope.go:117] "RemoveContainer" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.333808 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": container with ID starting with b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d not found: ID does not exist" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.333867 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} err="failed to get container status \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": rpc error: code = NotFound desc = could not find container \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": container with ID starting with b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.333899 4762 scope.go:117] "RemoveContainer" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc 
kubenswrapper[4762]: E0217 14:18:07.334444 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": container with ID starting with a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0 not found: ID does not exist" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.334489 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} err="failed to get container status \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": rpc error: code = NotFound desc = could not find container \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": container with ID starting with a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.334523 4762 scope.go:117] "RemoveContainer" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.335002 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": container with ID starting with 625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3 not found: ID does not exist" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.335026 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} err="failed to get container status \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": rpc error: code = NotFound desc = could not find container \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": container with ID starting with 625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.335050 4762 scope.go:117] "RemoveContainer" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: E0217 14:18:07.335420 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": container with ID starting with f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed not found: ID does not exist" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.335443 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} err="failed to get container status \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": rpc error: code = NotFound desc = could not find container \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": container with ID starting with f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: 
I0217 14:18:07.335459 4762 scope.go:117] "RemoveContainer" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.335796 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} err="failed to get container status \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": rpc error: code = NotFound desc = could not find container \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": container with ID starting with b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.335820 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336113 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} err="failed to get container status \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": rpc error: code = NotFound desc = could not find container \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": container with ID starting with 12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336138 4762 scope.go:117] "RemoveContainer" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336471 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} err="failed to get container status \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": rpc error: code = NotFound desc = could not find container \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": container with ID starting with 3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336500 4762 scope.go:117] "RemoveContainer" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336923 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} err="failed to get container status \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": rpc error: code = NotFound desc = could not find container \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": container with ID starting with 7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.336951 4762 scope.go:117] "RemoveContainer" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.337207 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} err="failed to get container status 
\"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": rpc error: code = NotFound desc = could not find container \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": container with ID starting with 327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.337230 4762 scope.go:117] "RemoveContainer" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.337725 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} err="failed to get container status \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": rpc error: code = NotFound desc = could not find container \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": container with ID starting with c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.337755 4762 scope.go:117] "RemoveContainer" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338030 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} err="failed to get container status \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": rpc error: code = NotFound desc = could not find container \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": container with ID starting with b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338058 4762 scope.go:117] "RemoveContainer" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338318 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} err="failed to get container status \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": rpc error: code = NotFound desc = could not find container \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": container with ID starting with a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338346 4762 scope.go:117] "RemoveContainer" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338805 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} err="failed to get container status \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": rpc error: code = NotFound desc = could not find container \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": container with ID starting with 625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.338838 4762 scope.go:117] "RemoveContainer" 
containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339209 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} err="failed to get container status \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": rpc error: code = NotFound desc = could not find container \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": container with ID starting with f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339260 4762 scope.go:117] "RemoveContainer" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339589 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} err="failed to get container status \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": rpc error: code = NotFound desc = could not find container \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": container with ID starting with b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339623 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339905 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} err="failed to get container status \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": rpc error: code = NotFound desc = could not find container \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": container with ID starting with 12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.339934 4762 scope.go:117] "RemoveContainer" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340287 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} err="failed to get container status \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": rpc error: code = NotFound desc = could not find container \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": container with ID starting with 3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340313 4762 scope.go:117] "RemoveContainer" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340578 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} err="failed to get container status \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": rpc error: code = NotFound desc = could not find 
container \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": container with ID starting with 7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340605 4762 scope.go:117] "RemoveContainer" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340942 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} err="failed to get container status \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": rpc error: code = NotFound desc = could not find container \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": container with ID starting with 327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.340962 4762 scope.go:117] "RemoveContainer" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.341294 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} err="failed to get container status \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": rpc error: code = NotFound desc = could not find container \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": container with ID starting with c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.341445 4762 scope.go:117] "RemoveContainer" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.342475 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} err="failed to get container status \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": rpc error: code = NotFound desc = could not find container \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": container with ID starting with b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.342497 4762 scope.go:117] "RemoveContainer" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.342876 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} err="failed to get container status \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": rpc error: code = NotFound desc = could not find container \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": container with ID starting with a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.342905 4762 scope.go:117] "RemoveContainer" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343190 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} err="failed to get container status \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": rpc error: code = NotFound desc = could not find container \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": container with ID starting with 625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343221 4762 scope.go:117] "RemoveContainer" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343507 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} err="failed to get container status \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": rpc error: code = NotFound desc = could not find container \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": container with ID starting with f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343534 4762 scope.go:117] "RemoveContainer" containerID="b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343829 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882"} err="failed to get container status \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": rpc error: code = NotFound desc = could not find container \"b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882\": container with ID starting with b323436612df2bde0829fb1fa6800b42501432f7a312888dfcb449e277410882 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.343855 4762 scope.go:117] "RemoveContainer" containerID="12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.350186 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d"} err="failed to get container status \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": rpc error: code = NotFound desc = could not find container \"12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d\": container with ID starting with 12d56b66f4331f33604c031c7aa2f9c6082c6329552bc0ef778893cf96ac577d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.350237 4762 scope.go:117] "RemoveContainer" containerID="3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.351240 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078"} err="failed to get container status \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": rpc error: code = NotFound desc = could not find container \"3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078\": container with ID starting with 
3f9f8df05e1b1c823b52957e3c4053a5f01224d8fef6678c1d94953843316078 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.351276 4762 scope.go:117] "RemoveContainer" containerID="7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.351921 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7"} err="failed to get container status \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": rpc error: code = NotFound desc = could not find container \"7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7\": container with ID starting with 7499a8d278063cea64d38dc864b97596dd441360b27d06d8815c5c30559615c7 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.351953 4762 scope.go:117] "RemoveContainer" containerID="327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.352294 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b"} err="failed to get container status \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": rpc error: code = NotFound desc = could not find container \"327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b\": container with ID starting with 327cf55b0f2ef6da9a93fecdfa3ab696c5f7f0ebc826baa8b1670bd9940a0c8b not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.352336 4762 scope.go:117] "RemoveContainer" containerID="c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.352739 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd"} err="failed to get container status \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": rpc error: code = NotFound desc = could not find container \"c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd\": container with ID starting with c4469fdbead6797d26c36a8be7d0f2f22a3145dfc0f977b299e364bb33632fcd not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.352760 4762 scope.go:117] "RemoveContainer" containerID="b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353009 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d"} err="failed to get container status \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": rpc error: code = NotFound desc = could not find container \"b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d\": container with ID starting with b5e435cdd9dfce8c3cc65556a76f43109dc6b41a4d856c4e5509cae3e5ce449d not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353033 4762 scope.go:117] "RemoveContainer" containerID="a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353227 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0"} err="failed to get container status \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": rpc error: code = NotFound desc = could not find container \"a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0\": container with ID starting with a72a882fa302af70d1f8148d3cce6a44932742c90c6e3501db360932514172a0 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353259 4762 scope.go:117] "RemoveContainer" containerID="625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353420 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3"} err="failed to get container status \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": rpc error: code = NotFound desc = could not find container \"625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3\": container with ID starting with 625632ad97fc5302f62c28826ab4f62c6f6c96dfc864712c1e26cab8500ea6a3 not found: ID does not exist" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353436 4762 scope.go:117] "RemoveContainer" containerID="f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed" Feb 17 14:18:07 crc kubenswrapper[4762]: I0217 14:18:07.353782 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed"} err="failed to get container status \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": rpc error: code = NotFound desc = could not find container \"f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed\": container with ID starting with f80e6d0e2c3a7a21de38b41e53e0ae108addb1fbec24d733dbddcf68911819ed not found: ID does not exist" Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.026507 4762 generic.go:334] "Generic (PLEG): container finished" podID="3a383139-de98-4e23-92ce-df401c79b08c" containerID="681e8e5f86dec221c35ac41eb3e0a601a8fa08dce47877b4de6a55990eed7d30" exitCode=0 Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.026559 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerDied","Data":"681e8e5f86dec221c35ac41eb3e0a601a8fa08dce47877b4de6a55990eed7d30"} Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.026609 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"c4e7d303762df338508af4fd8a06fa227c631a517ac1befe4f1363532de9eea6"} Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.031427 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r7p8_c1057884-d2c5-4911-9b97-fb4fedba9ab1/kube-multus/2.log" Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.031494 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r7p8" event={"ID":"c1057884-d2c5-4911-9b97-fb4fedba9ab1","Type":"ContainerStarted","Data":"5e82fc894ade1050c2f9c7882b00818716d9c759c4d99f56caeb2540bfd6499a"} Feb 17 14:18:08 crc kubenswrapper[4762]: I0217 14:18:08.083160 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ab134be0-88ef-45ac-80e0-963a60169ad2" path="/var/lib/kubelet/pods/ab134be0-88ef-45ac-80e0-963a60169ad2/volumes" Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.040875 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"cdd52ffd38461c7d779ea83aee4d1ba5611208f92cf5108faed3e9d915436001"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.041184 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"2aee6cd67563da1ac47437f2834b54f63f4af6b589f72188ab93f264eba9c1f9"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.041199 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"8998c55f531d4a1ebc9b0d3028678750d2a3c4ce831f1aea4fee34c1d0820d1d"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.041211 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"4c2c649936246685c6382f99ee5756f10508dc93a90f566b1b9b2a116ed00d8e"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.041220 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"ffc300ec25c40cb9c12e6cc4f020d2f6dcc10d12186745f50fd738aeea63e578"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.041230 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"c3f8f090647c67e21fbaf5f6edca56362b8a0d44d66915eeb0fc574101cffb3e"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.043464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerStarted","Data":"c95358a747f73f9ccb6c3c907bbecf3dcf957be91c2207225a5df83b9cd4b5e4"} Feb 17 14:18:09 crc kubenswrapper[4762]: I0217 14:18:09.063573 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvswd" podStartSLOduration=3.424387328 podStartE2EDuration="13.063556732s" podCreationTimestamp="2026-02-17 14:17:56 +0000 UTC" firstStartedPulling="2026-02-17 14:17:57.881986191 +0000 UTC m=+758.461986843" lastFinishedPulling="2026-02-17 14:18:07.521155595 +0000 UTC m=+768.101156247" observedRunningTime="2026-02-17 14:18:09.06090869 +0000 UTC m=+769.640909352" watchObservedRunningTime="2026-02-17 14:18:09.063556732 +0000 UTC m=+769.643557384" Feb 17 14:18:12 crc kubenswrapper[4762]: I0217 14:18:12.091034 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"705e7759064cd1e6860340b38424120534541db455dccd40b630ebf701118901"} Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.120005 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" 
event={"ID":"3a383139-de98-4e23-92ce-df401c79b08c","Type":"ContainerStarted","Data":"cdbac045d6698304d7cb341c944c1255d3be25847f2bec32f039c975e1a2809c"} Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.120786 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.120867 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.120878 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.272421 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.375869 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:15 crc kubenswrapper[4762]: I0217 14:18:15.656209 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" podStartSLOduration=9.656189467 podStartE2EDuration="9.656189467s" podCreationTimestamp="2026-02-17 14:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:18:15.652586739 +0000 UTC m=+776.232587411" watchObservedRunningTime="2026-02-17 14:18:15.656189467 +0000 UTC m=+776.236190119" Feb 17 14:18:16 crc kubenswrapper[4762]: I0217 14:18:16.975780 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:16 crc kubenswrapper[4762]: I0217 14:18:16.977242 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:18 crc kubenswrapper[4762]: I0217 14:18:18.152743 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvswd" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="registry-server" probeResult="failure" output=< Feb 17 14:18:18 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:18:18 crc kubenswrapper[4762]: > Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.733273 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw"] Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.734594 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.736343 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.736539 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-njmwl" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.736928 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.752887 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkj7\" (UniqueName: \"kubernetes.io/projected/d135e9df-e707-48e4-a0ad-0d400cb5b0c8-kube-api-access-9wkj7\") pod \"obo-prometheus-operator-68bc856cb9-csbmw\" (UID: \"d135e9df-e707-48e4-a0ad-0d400cb5b0c8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.756398 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw"] Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.854293 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkj7\" (UniqueName: \"kubernetes.io/projected/d135e9df-e707-48e4-a0ad-0d400cb5b0c8-kube-api-access-9wkj7\") pod \"obo-prometheus-operator-68bc856cb9-csbmw\" (UID: \"d135e9df-e707-48e4-a0ad-0d400cb5b0c8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.860291 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r"] Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.861054 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.867837 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-p26b5" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.868197 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.878791 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r"] Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.888206 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkj7\" (UniqueName: \"kubernetes.io/projected/d135e9df-e707-48e4-a0ad-0d400cb5b0c8-kube-api-access-9wkj7\") pod \"obo-prometheus-operator-68bc856cb9-csbmw\" (UID: \"d135e9df-e707-48e4-a0ad-0d400cb5b0c8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.895388 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx"] Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.896261 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:21 crc kubenswrapper[4762]: I0217 14:18:21.934406 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx"] Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.341429 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.342490 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.342552 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.342577 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.342618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.408908 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(e1e9134a1d2c242eed0fe9be4c55cf026a035a2fc2b467a1158cfc5cd643241d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.409200 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(e1e9134a1d2c242eed0fe9be4c55cf026a035a2fc2b467a1158cfc5cd643241d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.409226 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(e1e9134a1d2c242eed0fe9be4c55cf026a035a2fc2b467a1158cfc5cd643241d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.409271 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators(d135e9df-e707-48e4-a0ad-0d400cb5b0c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators(d135e9df-e707-48e4-a0ad-0d400cb5b0c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(e1e9134a1d2c242eed0fe9be4c55cf026a035a2fc2b467a1158cfc5cd643241d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" podUID="d135e9df-e707-48e4-a0ad-0d400cb5b0c8" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.443988 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.444097 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.444129 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.444172 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.448680 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.449990 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d126b4fc-9d8e-4886-8f76-53268a51258b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r\" (UID: \"d126b4fc-9d8e-4886-8f76-53268a51258b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.456114 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.493561 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.495326 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77607659-a202-47d9-8358-aa339e9ce99d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx\" (UID: \"77607659-a202-47d9-8358-aa339e9ce99d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.528911 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(9bafe9470d2c634970fdb643b65c5f7867b5153d529638e57b92f7f4b9bf7e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.528993 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(9bafe9470d2c634970fdb643b65c5f7867b5153d529638e57b92f7f4b9bf7e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.529022 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(9bafe9470d2c634970fdb643b65c5f7867b5153d529638e57b92f7f4b9bf7e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.529082 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators(d126b4fc-9d8e-4886-8f76-53268a51258b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators(d126b4fc-9d8e-4886-8f76-53268a51258b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(9bafe9470d2c634970fdb643b65c5f7867b5153d529638e57b92f7f4b9bf7e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" podUID="d126b4fc-9d8e-4886-8f76-53268a51258b" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.540966 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fb6t4"] Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.541118 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.541947 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.545085 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.545161 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42wh6\" (UniqueName: \"kubernetes.io/projected/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-kube-api-access-42wh6\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.550952 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.551101 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zfgcs" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.578387 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fb6t4"] Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.583746 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(78e7e11e26a8a369d16b2eb079c874802e3e9dd27dc5eb21e060c2590fbc0e6d): no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.583828 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(78e7e11e26a8a369d16b2eb079c874802e3e9dd27dc5eb21e060c2590fbc0e6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.583863 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(78e7e11e26a8a369d16b2eb079c874802e3e9dd27dc5eb21e060c2590fbc0e6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.583916 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators(77607659-a202-47d9-8358-aa339e9ce99d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators(77607659-a202-47d9-8358-aa339e9ce99d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(78e7e11e26a8a369d16b2eb079c874802e3e9dd27dc5eb21e060c2590fbc0e6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" podUID="77607659-a202-47d9-8358-aa339e9ce99d" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.650774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42wh6\" (UniqueName: \"kubernetes.io/projected/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-kube-api-access-42wh6\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.650867 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.655074 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.678397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42wh6\" (UniqueName: \"kubernetes.io/projected/5d34e0ae-c3d1-4d05-8a59-ca531de00d98-kube-api-access-42wh6\") pod \"observability-operator-59bdc8b94-fb6t4\" (UID: \"5d34e0ae-c3d1-4d05-8a59-ca531de00d98\") " pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.743422 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-788lp"] Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.744458 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.748233 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-j5zn4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.751695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9decd9a9-2c51-42dc-8fed-78efbe4c828e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.751774 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5d2\" (UniqueName: \"kubernetes.io/projected/9decd9a9-2c51-42dc-8fed-78efbe4c828e-kube-api-access-8q5d2\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.754127 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-788lp"] Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.853886 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5d2\" (UniqueName: \"kubernetes.io/projected/9decd9a9-2c51-42dc-8fed-78efbe4c828e-kube-api-access-8q5d2\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.854035 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9decd9a9-2c51-42dc-8fed-78efbe4c828e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.855330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9decd9a9-2c51-42dc-8fed-78efbe4c828e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.866399 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.880379 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5d2\" (UniqueName: \"kubernetes.io/projected/9decd9a9-2c51-42dc-8fed-78efbe4c828e-kube-api-access-8q5d2\") pod \"perses-operator-5bf474d74f-788lp\" (UID: \"9decd9a9-2c51-42dc-8fed-78efbe4c828e\") " pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.897378 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(38230ab6e16a91eaf85eb7d0eb1e1c4f2d84ddc4649bee09218bfcd5c886cbfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.897445 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(38230ab6e16a91eaf85eb7d0eb1e1c4f2d84ddc4649bee09218bfcd5c886cbfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.897476 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(38230ab6e16a91eaf85eb7d0eb1e1c4f2d84ddc4649bee09218bfcd5c886cbfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: E0217 14:18:22.897527 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-fb6t4_openshift-operators(5d34e0ae-c3d1-4d05-8a59-ca531de00d98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-fb6t4_openshift-operators(5d34e0ae-c3d1-4d05-8a59-ca531de00d98)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(38230ab6e16a91eaf85eb7d0eb1e1c4f2d84ddc4649bee09218bfcd5c886cbfb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" podUID="5d34e0ae-c3d1-4d05-8a59-ca531de00d98" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.938877 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.939518 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.939971 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.940272 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.940591 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.940886 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.941144 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:22 crc kubenswrapper[4762]: I0217 14:18:22.941417 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:23 crc kubenswrapper[4762]: I0217 14:18:23.069001 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.069488 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(4323063e06fac69e00f980b3d7d484c432f5967b0712b71c7a77f9200444481b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.069538 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(4323063e06fac69e00f980b3d7d484c432f5967b0712b71c7a77f9200444481b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.069562 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(4323063e06fac69e00f980b3d7d484c432f5967b0712b71c7a77f9200444481b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.069603 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators(77607659-a202-47d9-8358-aa339e9ce99d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators(77607659-a202-47d9-8358-aa339e9ce99d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_openshift-operators_77607659-a202-47d9-8358-aa339e9ce99d_0(4323063e06fac69e00f980b3d7d484c432f5967b0712b71c7a77f9200444481b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" podUID="77607659-a202-47d9-8358-aa339e9ce99d" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.085818 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(fab1b7ef51184e1233ab18032ef808c7545037529065d8cdfb166fe2cbe12eb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.085904 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(fab1b7ef51184e1233ab18032ef808c7545037529065d8cdfb166fe2cbe12eb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.085930 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(fab1b7ef51184e1233ab18032ef808c7545037529065d8cdfb166fe2cbe12eb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.085975 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators(d135e9df-e707-48e4-a0ad-0d400cb5b0c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators(d135e9df-e707-48e4-a0ad-0d400cb5b0c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-csbmw_openshift-operators_d135e9df-e707-48e4-a0ad-0d400cb5b0c8_0(fab1b7ef51184e1233ab18032ef808c7545037529065d8cdfb166fe2cbe12eb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" podUID="d135e9df-e707-48e4-a0ad-0d400cb5b0c8" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.093884 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(72a7e2280dc61fd7874f14c049b2f9d1ec1e1817250638dfa3dbe196488dcc36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.093943 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(72a7e2280dc61fd7874f14c049b2f9d1ec1e1817250638dfa3dbe196488dcc36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.093963 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(72a7e2280dc61fd7874f14c049b2f9d1ec1e1817250638dfa3dbe196488dcc36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.094005 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators(d126b4fc-9d8e-4886-8f76-53268a51258b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators(d126b4fc-9d8e-4886-8f76-53268a51258b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_openshift-operators_d126b4fc-9d8e-4886-8f76-53268a51258b_0(72a7e2280dc61fd7874f14c049b2f9d1ec1e1817250638dfa3dbe196488dcc36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" podUID="d126b4fc-9d8e-4886-8f76-53268a51258b" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.113027 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(f917f4fa4b94b1456b5dbf1c252c8265963c1500137d2a8238d6486bf9cf6390): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.113088 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(f917f4fa4b94b1456b5dbf1c252c8265963c1500137d2a8238d6486bf9cf6390): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.113109 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(f917f4fa4b94b1456b5dbf1c252c8265963c1500137d2a8238d6486bf9cf6390): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.113146 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-fb6t4_openshift-operators(5d34e0ae-c3d1-4d05-8a59-ca531de00d98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-fb6t4_openshift-operators(5d34e0ae-c3d1-4d05-8a59-ca531de00d98)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-fb6t4_openshift-operators_5d34e0ae-c3d1-4d05-8a59-ca531de00d98_0(f917f4fa4b94b1456b5dbf1c252c8265963c1500137d2a8238d6486bf9cf6390): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" podUID="5d34e0ae-c3d1-4d05-8a59-ca531de00d98" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.116922 4762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-788lp_openshift-operators_9decd9a9-2c51-42dc-8fed-78efbe4c828e_0(a8be6d3e2037c21c5041a6358cbbb491d0811dd2dd363585e468da7f689e33be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.117008 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-788lp_openshift-operators_9decd9a9-2c51-42dc-8fed-78efbe4c828e_0(a8be6d3e2037c21c5041a6358cbbb491d0811dd2dd363585e468da7f689e33be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.117037 4762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-788lp_openshift-operators_9decd9a9-2c51-42dc-8fed-78efbe4c828e_0(a8be6d3e2037c21c5041a6358cbbb491d0811dd2dd363585e468da7f689e33be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:23 crc kubenswrapper[4762]: E0217 14:18:23.117084 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-788lp_openshift-operators(9decd9a9-2c51-42dc-8fed-78efbe4c828e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-788lp_openshift-operators(9decd9a9-2c51-42dc-8fed-78efbe4c828e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-788lp_openshift-operators_9decd9a9-2c51-42dc-8fed-78efbe4c828e_0(a8be6d3e2037c21c5041a6358cbbb491d0811dd2dd363585e468da7f689e33be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-788lp" podUID="9decd9a9-2c51-42dc-8fed-78efbe4c828e" Feb 17 14:18:23 crc kubenswrapper[4762]: I0217 14:18:23.944087 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:23 crc kubenswrapper[4762]: I0217 14:18:23.944885 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.621328 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.621609 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.621684 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.622325 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.622395 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841" gracePeriod=600 Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.659572 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-788lp"] Feb 17 14:18:24 crc kubenswrapper[4762]: W0217 14:18:24.664444 4762 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9decd9a9_2c51_42dc_8fed_78efbe4c828e.slice/crio-a360403ec2a12decfad0d76949db4a3b3954511e8b37eb197fb6ee7717366a35 WatchSource:0}: Error finding container a360403ec2a12decfad0d76949db4a3b3954511e8b37eb197fb6ee7717366a35: Status 404 returned error can't find the container with id a360403ec2a12decfad0d76949db4a3b3954511e8b37eb197fb6ee7717366a35 Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.952356 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841" exitCode=0 Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.952422 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841"} Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.952454 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba"} Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.952474 4762 scope.go:117] "RemoveContainer" containerID="817296b81932e51cfaf5f5110e46a8a500731db1cf4d8ef393c04d896b5ebe8b" Feb 17 14:18:24 crc kubenswrapper[4762]: I0217 14:18:24.954123 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-788lp" event={"ID":"9decd9a9-2c51-42dc-8fed-78efbe4c828e","Type":"ContainerStarted","Data":"a360403ec2a12decfad0d76949db4a3b3954511e8b37eb197fb6ee7717366a35"} Feb 17 14:18:27 crc kubenswrapper[4762]: I0217 14:18:27.034226 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:27 crc kubenswrapper[4762]: I0217 14:18:27.088474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:27 crc kubenswrapper[4762]: I0217 14:18:27.805168 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:18:28 crc kubenswrapper[4762]: I0217 14:18:28.992260 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvswd" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="registry-server" containerID="cri-o://c95358a747f73f9ccb6c3c907bbecf3dcf957be91c2207225a5df83b9cd4b5e4" gracePeriod=2 Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.013545 4762 generic.go:334] "Generic (PLEG): container finished" podID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerID="c95358a747f73f9ccb6c3c907bbecf3dcf957be91c2207225a5df83b9cd4b5e4" exitCode=0 Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.013603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerDied","Data":"c95358a747f73f9ccb6c3c907bbecf3dcf957be91c2207225a5df83b9cd4b5e4"} Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.802730 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.907394 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities\") pod \"99fa7921-3767-449e-a15c-cfb265cd16a2\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.907988 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content\") pod \"99fa7921-3767-449e-a15c-cfb265cd16a2\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.908100 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tss9n\" (UniqueName: \"kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n\") pod \"99fa7921-3767-449e-a15c-cfb265cd16a2\" (UID: \"99fa7921-3767-449e-a15c-cfb265cd16a2\") " Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.908318 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities" (OuterVolumeSpecName: "utilities") pod "99fa7921-3767-449e-a15c-cfb265cd16a2" (UID: "99fa7921-3767-449e-a15c-cfb265cd16a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.908691 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:30 crc kubenswrapper[4762]: I0217 14:18:30.915103 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n" (OuterVolumeSpecName: "kube-api-access-tss9n") pod "99fa7921-3767-449e-a15c-cfb265cd16a2" (UID: "99fa7921-3767-449e-a15c-cfb265cd16a2"). InnerVolumeSpecName "kube-api-access-tss9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.009608 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tss9n\" (UniqueName: \"kubernetes.io/projected/99fa7921-3767-449e-a15c-cfb265cd16a2-kube-api-access-tss9n\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.021297 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvswd" event={"ID":"99fa7921-3767-449e-a15c-cfb265cd16a2","Type":"ContainerDied","Data":"7734a9cbfe688e81763ba15e89047c2b00defa3b4daf7604134229846ce6a2dd"} Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.021361 4762 scope.go:117] "RemoveContainer" containerID="c95358a747f73f9ccb6c3c907bbecf3dcf957be91c2207225a5df83b9cd4b5e4" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.021503 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvswd" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.036621 4762 scope.go:117] "RemoveContainer" containerID="ec5888efafd82f4032e941295a8b012724d6bdad845cb96c969c39b6142b8a56" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.037239 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-788lp" event={"ID":"9decd9a9-2c51-42dc-8fed-78efbe4c828e","Type":"ContainerStarted","Data":"fd713bb265fd0c00cdaeed0cea0427a7f97d89779da93af6389f4fbdfb76a7b3"} Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.037693 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.046050 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99fa7921-3767-449e-a15c-cfb265cd16a2" (UID: "99fa7921-3767-449e-a15c-cfb265cd16a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.068892 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-788lp" podStartSLOduration=2.9182642899999998 podStartE2EDuration="9.06887279s" podCreationTimestamp="2026-02-17 14:18:22 +0000 UTC" firstStartedPulling="2026-02-17 14:18:24.667206557 +0000 UTC m=+785.247207209" lastFinishedPulling="2026-02-17 14:18:30.817815057 +0000 UTC m=+791.397815709" observedRunningTime="2026-02-17 14:18:31.064497451 +0000 UTC m=+791.644498103" watchObservedRunningTime="2026-02-17 14:18:31.06887279 +0000 UTC m=+791.648873452" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.086327 4762 scope.go:117] "RemoveContainer" containerID="c6332179c33b0fd35f102a359076f9b54e2ffcfdc44120325909f67486948bdc" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.110547 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fa7921-3767-449e-a15c-cfb265cd16a2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.354856 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:18:31 crc kubenswrapper[4762]: I0217 14:18:31.359470 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvswd"] Feb 17 14:18:32 crc kubenswrapper[4762]: I0217 14:18:32.094707 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" path="/var/lib/kubelet/pods/99fa7921-3767-449e-a15c-cfb265cd16a2/volumes" Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.070513 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.070540 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.071240 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.071256 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.498577 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw"] Feb 17 14:18:34 crc kubenswrapper[4762]: I0217 14:18:34.596560 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx"] Feb 17 14:18:34 crc kubenswrapper[4762]: W0217 14:18:34.604525 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77607659_a202_47d9_8358_aa339e9ce99d.slice/crio-0ef2decfee41007fcf3b1de768f4fd76b9eec1516d266b661764816d5e6e0fe9 WatchSource:0}: Error finding container 0ef2decfee41007fcf3b1de768f4fd76b9eec1516d266b661764816d5e6e0fe9: Status 404 returned error can't find the container with id 0ef2decfee41007fcf3b1de768f4fd76b9eec1516d266b661764816d5e6e0fe9 Feb 17 14:18:35 crc kubenswrapper[4762]: I0217 14:18:35.061408 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" event={"ID":"77607659-a202-47d9-8358-aa339e9ce99d","Type":"ContainerStarted","Data":"0ef2decfee41007fcf3b1de768f4fd76b9eec1516d266b661764816d5e6e0fe9"} Feb 17 14:18:35 crc kubenswrapper[4762]: I0217 14:18:35.062779 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" event={"ID":"d135e9df-e707-48e4-a0ad-0d400cb5b0c8","Type":"ContainerStarted","Data":"98fdd032d8ea424a0f50dc6523be637ced594c5c2d249297b6946baed0ea1c0e"} Feb 17 14:18:35 crc kubenswrapper[4762]: I0217 14:18:35.073548 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:35 crc kubenswrapper[4762]: I0217 14:18:35.074016 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:35 crc kubenswrapper[4762]: I0217 14:18:35.359556 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fb6t4"] Feb 17 14:18:35 crc kubenswrapper[4762]: W0217 14:18:35.369294 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d34e0ae_c3d1_4d05_8a59_ca531de00d98.slice/crio-a750846a52b4cf0860802b981b26af0671778c52c857cd3988add35f348e571a WatchSource:0}: Error finding container a750846a52b4cf0860802b981b26af0671778c52c857cd3988add35f348e571a: Status 404 returned error can't find the container with id a750846a52b4cf0860802b981b26af0671778c52c857cd3988add35f348e571a Feb 17 14:18:36 crc kubenswrapper[4762]: I0217 14:18:36.068802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" event={"ID":"5d34e0ae-c3d1-4d05-8a59-ca531de00d98","Type":"ContainerStarted","Data":"a750846a52b4cf0860802b981b26af0671778c52c857cd3988add35f348e571a"} Feb 17 14:18:36 crc kubenswrapper[4762]: I0217 14:18:36.070192 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:36 crc kubenswrapper[4762]: I0217 14:18:36.070810 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" Feb 17 14:18:37 crc kubenswrapper[4762]: I0217 14:18:37.254594 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzpnp" Feb 17 14:18:38 crc kubenswrapper[4762]: I0217 14:18:38.089454 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" event={"ID":"77607659-a202-47d9-8358-aa339e9ce99d","Type":"ContainerStarted","Data":"5fc14f5dc2915a930661eb56de445f1044e4697dc3a36825341939008f065d8b"} Feb 17 14:18:38 crc kubenswrapper[4762]: I0217 14:18:38.121105 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-xgzjx" podStartSLOduration=13.874931962 podStartE2EDuration="17.121076228s" podCreationTimestamp="2026-02-17 14:18:21 +0000 UTC" firstStartedPulling="2026-02-17 14:18:34.607237084 +0000 UTC m=+795.187237726" lastFinishedPulling="2026-02-17 14:18:37.85338134 +0000 UTC m=+798.433381992" observedRunningTime="2026-02-17 14:18:38.109617685 +0000 UTC m=+798.689618337" watchObservedRunningTime="2026-02-17 14:18:38.121076228 +0000 UTC m=+798.701076880" Feb 17 14:18:38 crc kubenswrapper[4762]: I0217 14:18:38.310346 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r"] Feb 17 14:18:39 crc kubenswrapper[4762]: I0217 14:18:39.098353 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" event={"ID":"d126b4fc-9d8e-4886-8f76-53268a51258b","Type":"ContainerStarted","Data":"a31e5fce83c35c4e03bb7cfb5beafaf14233a6ef2f2ca6bd521205be3e07065c"} Feb 17 14:18:39 crc kubenswrapper[4762]: I0217 14:18:39.098720 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" event={"ID":"d126b4fc-9d8e-4886-8f76-53268a51258b","Type":"ContainerStarted","Data":"9f9b2a306ec0cc1dba9110cffbae531f799bb8cbeee23811a35f2c99faec7e3b"} Feb 17 14:18:39 crc kubenswrapper[4762]: I0217 14:18:39.102088 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" event={"ID":"d135e9df-e707-48e4-a0ad-0d400cb5b0c8","Type":"ContainerStarted","Data":"a59e9684c96a011f4fe922788eb801877a478ae7c83caebbcf71fab99ee65357"} Feb 17 14:18:39 crc kubenswrapper[4762]: I0217 14:18:39.128580 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86644c88f-l5r9r" podStartSLOduration=18.128555812 podStartE2EDuration="18.128555812s" podCreationTimestamp="2026-02-17 14:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:18:39.120497391 +0000 UTC m=+799.700498043" watchObservedRunningTime="2026-02-17 14:18:39.128555812 +0000 UTC m=+799.708556464" Feb 17 14:18:39 crc kubenswrapper[4762]: I0217 14:18:39.178769 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbmw" podStartSLOduration=14.832422719 podStartE2EDuration="18.178749294s" podCreationTimestamp="2026-02-17 14:18:21 +0000 UTC" firstStartedPulling="2026-02-17 14:18:34.523882305 +0000 UTC m=+795.103882957" lastFinishedPulling="2026-02-17 14:18:37.87020888 +0000 UTC m=+798.450209532" observedRunningTime="2026-02-17 14:18:39.174761715 +0000 UTC m=+799.754762387" watchObservedRunningTime="2026-02-17 14:18:39.178749294 +0000 UTC m=+799.758749936" Feb 17 14:18:43 crc kubenswrapper[4762]: I0217 14:18:43.073910 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-788lp" Feb 17 14:18:46 crc kubenswrapper[4762]: I0217 14:18:46.162458 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" event={"ID":"5d34e0ae-c3d1-4d05-8a59-ca531de00d98","Type":"ContainerStarted","Data":"0194b9539365331c99b2393044800b43f8d8650ab93fbbe819c12d816667b49c"} Feb 17 14:18:46 crc kubenswrapper[4762]: I0217 14:18:46.163067 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:46 crc kubenswrapper[4762]: I0217 14:18:46.263516 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" podStartSLOduration=14.466144425 podStartE2EDuration="24.263490881s" podCreationTimestamp="2026-02-17 14:18:22 +0000 UTC" firstStartedPulling="2026-02-17 14:18:35.373707709 +0000 UTC m=+795.953708361" lastFinishedPulling="2026-02-17 14:18:45.171054165 +0000 UTC m=+805.751054817" observedRunningTime="2026-02-17 14:18:46.182783644 +0000 UTC m=+806.762784296" watchObservedRunningTime="2026-02-17 14:18:46.263490881 +0000 UTC m=+806.843491533" Feb 17 14:18:46 crc kubenswrapper[4762]: I0217 14:18:46.303945 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-fb6t4" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.576462 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-27rxl"] Feb 17 14:18:56 crc kubenswrapper[4762]: E0217 14:18:56.577316 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="extract-utilities" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.577336 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="extract-utilities" Feb 17 14:18:56 crc kubenswrapper[4762]: E0217 14:18:56.577348 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="extract-content" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.577355 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="extract-content" Feb 17 14:18:56 crc kubenswrapper[4762]: E0217 14:18:56.577390 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="registry-server" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.577399 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="registry-server" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.577567 4762 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="99fa7921-3767-449e-a15c-cfb265cd16a2" containerName="registry-server" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.578159 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.597531 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-5fk9z"] Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.598761 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.598986 4762 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bzgbh" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.599743 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5fk9z" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.826201 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.827741 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7h6b\" (UniqueName: \"kubernetes.io/projected/2dd817de-0e2d-40fe-ba7d-036a6e1247dd-kube-api-access-c7h6b\") pod \"cert-manager-cainjector-cf98fcc89-27rxl\" (UID: \"2dd817de-0e2d-40fe-ba7d-036a6e1247dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.827824 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4zr\" (UniqueName: \"kubernetes.io/projected/24448600-d00c-44b6-a1d9-08ce0d5cd43c-kube-api-access-dw4zr\") pod \"cert-manager-858654f9db-5fk9z\" (UID: \"24448600-d00c-44b6-a1d9-08ce0d5cd43c\") " pod="cert-manager/cert-manager-858654f9db-5fk9z" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.837808 4762 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-w8hdq" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.847412 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-27rxl"] Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.851758 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5fk9z"] Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.878030 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dpg84"] Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.878993 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.883997 4762 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pzcj6" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.888085 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dpg84"] Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.929928 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7h6b\" (UniqueName: \"kubernetes.io/projected/2dd817de-0e2d-40fe-ba7d-036a6e1247dd-kube-api-access-c7h6b\") pod \"cert-manager-cainjector-cf98fcc89-27rxl\" (UID: \"2dd817de-0e2d-40fe-ba7d-036a6e1247dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.929993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4zr\" (UniqueName: \"kubernetes.io/projected/24448600-d00c-44b6-a1d9-08ce0d5cd43c-kube-api-access-dw4zr\") pod \"cert-manager-858654f9db-5fk9z\" (UID: \"24448600-d00c-44b6-a1d9-08ce0d5cd43c\") " pod="cert-manager/cert-manager-858654f9db-5fk9z" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.950096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7h6b\" (UniqueName: \"kubernetes.io/projected/2dd817de-0e2d-40fe-ba7d-036a6e1247dd-kube-api-access-c7h6b\") pod \"cert-manager-cainjector-cf98fcc89-27rxl\" (UID: \"2dd817de-0e2d-40fe-ba7d-036a6e1247dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" Feb 17 14:18:56 crc kubenswrapper[4762]: I0217 14:18:56.950232 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4zr\" (UniqueName: \"kubernetes.io/projected/24448600-d00c-44b6-a1d9-08ce0d5cd43c-kube-api-access-dw4zr\") pod \"cert-manager-858654f9db-5fk9z\" (UID: \"24448600-d00c-44b6-a1d9-08ce0d5cd43c\") " pod="cert-manager/cert-manager-858654f9db-5fk9z" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.031560 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44c84\" (UniqueName: \"kubernetes.io/projected/9233ba97-592c-4c1d-9326-c726d6d43f12-kube-api-access-44c84\") pod \"cert-manager-webhook-687f57d79b-dpg84\" (UID: \"9233ba97-592c-4c1d-9326-c726d6d43f12\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.133493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44c84\" (UniqueName: \"kubernetes.io/projected/9233ba97-592c-4c1d-9326-c726d6d43f12-kube-api-access-44c84\") pod \"cert-manager-webhook-687f57d79b-dpg84\" (UID: \"9233ba97-592c-4c1d-9326-c726d6d43f12\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.150468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44c84\" (UniqueName: \"kubernetes.io/projected/9233ba97-592c-4c1d-9326-c726d6d43f12-kube-api-access-44c84\") pod \"cert-manager-webhook-687f57d79b-dpg84\" (UID: \"9233ba97-592c-4c1d-9326-c726d6d43f12\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.195373 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.199208 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" Feb 17 14:18:57 crc kubenswrapper[4762]: I0217 14:18:57.223460 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5fk9z" Feb 17 14:18:58 crc kubenswrapper[4762]: I0217 14:18:58.102271 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dpg84"] Feb 17 14:18:58 crc kubenswrapper[4762]: I0217 14:18:58.168141 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-27rxl"] Feb 17 14:18:58 crc kubenswrapper[4762]: I0217 14:18:58.494637 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5fk9z"] Feb 17 14:18:59 crc kubenswrapper[4762]: I0217 14:18:59.023384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" event={"ID":"2dd817de-0e2d-40fe-ba7d-036a6e1247dd","Type":"ContainerStarted","Data":"bb33c69383f39a4f36c080d4a0f34abedea9c73804e55d131257aa0f5926f1ba"} Feb 17 14:18:59 crc kubenswrapper[4762]: I0217 14:18:59.024296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" event={"ID":"9233ba97-592c-4c1d-9326-c726d6d43f12","Type":"ContainerStarted","Data":"3dd17b525ed0ab72005412bb52cc1da8aeda3b8c64e1c8e10932ce9fb013130c"} Feb 17 14:18:59 crc kubenswrapper[4762]: I0217 14:18:59.025442 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-5fk9z" event={"ID":"24448600-d00c-44b6-a1d9-08ce0d5cd43c","Type":"ContainerStarted","Data":"03686cc4c673b0218b7de8ab46b86cf3a9bd78f1508106f70f5056630e7fd56c"} Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.102110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" event={"ID":"9233ba97-592c-4c1d-9326-c726d6d43f12","Type":"ContainerStarted","Data":"9c9d0816d81806a95784dc827256addd793d5ca8c99e3c23549c1b7bdf44dde2"} Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.105872 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.108416 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-5fk9z" event={"ID":"24448600-d00c-44b6-a1d9-08ce0d5cd43c","Type":"ContainerStarted","Data":"c8ee6f28603ec36f6cb191a611c699a3e2cead728b74ff09ab3098207c2458d9"} Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.112430 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" event={"ID":"2dd817de-0e2d-40fe-ba7d-036a6e1247dd","Type":"ContainerStarted","Data":"9de45ab53656222c20dda3d617fe5572d958fb5c84551a4b26e51500965c52a3"} Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.145851 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" podStartSLOduration=3.092150462 podStartE2EDuration="8.145818903s" podCreationTimestamp="2026-02-17 14:18:56 +0000 UTC" firstStartedPulling="2026-02-17 14:18:58.130251915 +0000 UTC m=+818.710252567" lastFinishedPulling="2026-02-17 
14:19:03.183920356 +0000 UTC m=+823.763921008" observedRunningTime="2026-02-17 14:19:04.142078631 +0000 UTC m=+824.722079283" watchObservedRunningTime="2026-02-17 14:19:04.145818903 +0000 UTC m=+824.725819555" Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.214073 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-5fk9z" podStartSLOduration=3.374215713 podStartE2EDuration="8.214044438s" podCreationTimestamp="2026-02-17 14:18:56 +0000 UTC" firstStartedPulling="2026-02-17 14:18:58.500673932 +0000 UTC m=+819.080674594" lastFinishedPulling="2026-02-17 14:19:03.340502667 +0000 UTC m=+823.920503319" observedRunningTime="2026-02-17 14:19:04.16984897 +0000 UTC m=+824.749849622" watchObservedRunningTime="2026-02-17 14:19:04.214044438 +0000 UTC m=+824.794045100" Feb 17 14:19:04 crc kubenswrapper[4762]: I0217 14:19:04.222328 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-27rxl" podStartSLOduration=3.128810194 podStartE2EDuration="8.222293524s" podCreationTimestamp="2026-02-17 14:18:56 +0000 UTC" firstStartedPulling="2026-02-17 14:18:58.189640158 +0000 UTC m=+818.769640810" lastFinishedPulling="2026-02-17 14:19:03.283123488 +0000 UTC m=+823.863124140" observedRunningTime="2026-02-17 14:19:04.218812799 +0000 UTC m=+824.798813461" watchObservedRunningTime="2026-02-17 14:19:04.222293524 +0000 UTC m=+824.802294176" Feb 17 14:19:12 crc kubenswrapper[4762]: I0217 14:19:12.198898 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dpg84" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.014330 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q"] Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.022023 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.032816 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q"] Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.035753 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.098368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2lp\" (UniqueName: \"kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.098767 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.098809 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.200139 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.200261 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2lp\" (UniqueName: \"kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.200386 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.201906 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.202564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.227621 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2lp\" (UniqueName: \"kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.238272 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld"] Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.239547 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.261861 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld"] Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.301942 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.302115 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.302235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzbh\" (UniqueName: \"kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.424701 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.425199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.425302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzbh\" (UniqueName: \"kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.425378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.425808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.425856 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.447887 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzbh\" (UniqueName: \"kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.563057 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.672416 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q"] Feb 17 14:19:43 crc kubenswrapper[4762]: I0217 14:19:43.820574 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld"] Feb 17 14:19:43 crc kubenswrapper[4762]: W0217 14:19:43.832453 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b88810f_7e51_448f_91a4_327a41a07307.slice/crio-06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d WatchSource:0}: Error finding container 06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d: Status 404 returned error can't find the container with id 06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.364179 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerID="471eefd0ee0ae74c87a198403c5efc0bbe6a57a9023268df0ab593675645765c" exitCode=0 Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.364344 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerDied","Data":"471eefd0ee0ae74c87a198403c5efc0bbe6a57a9023268df0ab593675645765c"} Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.364487 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerStarted","Data":"ba7d1b24f7e4e8f50ca35e4685b01e7afe2caa015cf89ffbc6862b78a97988c1"} Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.366856 4762 generic.go:334] "Generic (PLEG): container finished" podID="0b88810f-7e51-448f-91a4-327a41a07307" containerID="c68001284635e8cce98e0ba286b1d88fd99c2203747247a9815d5dd3cbb0820b" exitCode=0 Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.366902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" event={"ID":"0b88810f-7e51-448f-91a4-327a41a07307","Type":"ContainerDied","Data":"c68001284635e8cce98e0ba286b1d88fd99c2203747247a9815d5dd3cbb0820b"} Feb 17 14:19:44 crc kubenswrapper[4762]: I0217 14:19:44.366934 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" event={"ID":"0b88810f-7e51-448f-91a4-327a41a07307","Type":"ContainerStarted","Data":"06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d"} Feb 17 14:19:46 crc kubenswrapper[4762]: I0217 14:19:46.383038 4762 generic.go:334] "Generic (PLEG): container finished" podID="0b88810f-7e51-448f-91a4-327a41a07307" containerID="cd02f7b1c1ee86b10773356bb301cfa1e3fa70ec838bf2cbee64089e8bc9a386" exitCode=0 Feb 17 14:19:46 crc kubenswrapper[4762]: I0217 14:19:46.383180 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" 
event={"ID":"0b88810f-7e51-448f-91a4-327a41a07307","Type":"ContainerDied","Data":"cd02f7b1c1ee86b10773356bb301cfa1e3fa70ec838bf2cbee64089e8bc9a386"} Feb 17 14:19:46 crc kubenswrapper[4762]: I0217 14:19:46.386039 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerStarted","Data":"16e3dae0308c88a5e52ab23f87a37f7bace184d901703d1e45f0d078928e1cdd"} Feb 17 14:19:47 crc kubenswrapper[4762]: I0217 14:19:47.395618 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerID="16e3dae0308c88a5e52ab23f87a37f7bace184d901703d1e45f0d078928e1cdd" exitCode=0 Feb 17 14:19:47 crc kubenswrapper[4762]: I0217 14:19:47.395803 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerDied","Data":"16e3dae0308c88a5e52ab23f87a37f7bace184d901703d1e45f0d078928e1cdd"} Feb 17 14:19:47 crc kubenswrapper[4762]: I0217 14:19:47.400500 4762 generic.go:334] "Generic (PLEG): container finished" podID="0b88810f-7e51-448f-91a4-327a41a07307" containerID="0e250e0c4b73a31dedba51fede37095d442d55805fbb1cda1a5370e58a482e84" exitCode=0 Feb 17 14:19:47 crc kubenswrapper[4762]: I0217 14:19:47.400557 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" event={"ID":"0b88810f-7e51-448f-91a4-327a41a07307","Type":"ContainerDied","Data":"0e250e0c4b73a31dedba51fede37095d442d55805fbb1cda1a5370e58a482e84"} Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.411434 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerID="7823482e863965447a19247305f27175da92ed3a226f4e4400d07c17bb9549ae" exitCode=0 Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.411492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerDied","Data":"7823482e863965447a19247305f27175da92ed3a226f4e4400d07c17bb9549ae"} Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.696810 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.817626 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngzbh\" (UniqueName: \"kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh\") pod \"0b88810f-7e51-448f-91a4-327a41a07307\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.818058 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle\") pod \"0b88810f-7e51-448f-91a4-327a41a07307\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.818164 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util\") pod \"0b88810f-7e51-448f-91a4-327a41a07307\" (UID: \"0b88810f-7e51-448f-91a4-327a41a07307\") " Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.819117 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle" (OuterVolumeSpecName: "bundle") pod "0b88810f-7e51-448f-91a4-327a41a07307" (UID: "0b88810f-7e51-448f-91a4-327a41a07307"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.822848 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh" (OuterVolumeSpecName: "kube-api-access-ngzbh") pod "0b88810f-7e51-448f-91a4-327a41a07307" (UID: "0b88810f-7e51-448f-91a4-327a41a07307"). InnerVolumeSpecName "kube-api-access-ngzbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.833849 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util" (OuterVolumeSpecName: "util") pod "0b88810f-7e51-448f-91a4-327a41a07307" (UID: "0b88810f-7e51-448f-91a4-327a41a07307"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.919575 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngzbh\" (UniqueName: \"kubernetes.io/projected/0b88810f-7e51-448f-91a4-327a41a07307-kube-api-access-ngzbh\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.919619 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:48 crc kubenswrapper[4762]: I0217 14:19:48.919631 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b88810f-7e51-448f-91a4-327a41a07307-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.421760 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.421733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld" event={"ID":"0b88810f-7e51-448f-91a4-327a41a07307","Type":"ContainerDied","Data":"06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d"} Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.421808 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06df20f37f0f450b3492bce65e061d1157d962dccec5d3af579ffba8ac203a5d" Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.906333 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.983621 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2lp\" (UniqueName: \"kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp\") pod \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.983684 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util\") pod \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.983708 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle\") pod \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\" (UID: \"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5\") " Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.984915 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle" (OuterVolumeSpecName: "bundle") pod "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" (UID: "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:19:49 crc kubenswrapper[4762]: I0217 14:19:49.987837 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp" (OuterVolumeSpecName: "kube-api-access-8n2lp") pod "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" (UID: "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5"). InnerVolumeSpecName "kube-api-access-8n2lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.000128 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util" (OuterVolumeSpecName: "util") pod "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" (UID: "4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.270550 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2lp\" (UniqueName: \"kubernetes.io/projected/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-kube-api-access-8n2lp\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.271503 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.271597 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.431105 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" event={"ID":"4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5","Type":"ContainerDied","Data":"ba7d1b24f7e4e8f50ca35e4685b01e7afe2caa015cf89ffbc6862b78a97988c1"} Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.431171 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7d1b24f7e4e8f50ca35e4685b01e7afe2caa015cf89ffbc6862b78a97988c1" Feb 17 14:19:50 crc kubenswrapper[4762]: I0217 14:19:50.431214 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.704378 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj"] Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705305 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="pull" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705322 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="pull" Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705344 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705352 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705364 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="util" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705373 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="util" Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705385 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="pull" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705392 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="pull" Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705405 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="util" Feb 
17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705413 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="util" Feb 17 14:19:59 crc kubenswrapper[4762]: E0217 14:19:59.705424 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705432 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705581 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b88810f-7e51-448f-91a4-327a41a07307" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.705598 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5" containerName="extract" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.706456 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.708862 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.708871 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.709173 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-pg6wp" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.709243 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.709256 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.709363 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.727358 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj"] Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.800455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/425e262b-13e9-474a-85f5-1a0501569aa9-manager-config\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.800529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-apiservice-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.800611 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.800672 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-webhook-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.800720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jst6f\" (UniqueName: \"kubernetes.io/projected/425e262b-13e9-474a-85f5-1a0501569aa9-kube-api-access-jst6f\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.901976 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/425e262b-13e9-474a-85f5-1a0501569aa9-manager-config\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.902050 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-apiservice-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.902138 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-webhook-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.902166 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.902209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jst6f\" (UniqueName: \"kubernetes.io/projected/425e262b-13e9-474a-85f5-1a0501569aa9-kube-api-access-jst6f\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.902997 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/425e262b-13e9-474a-85f5-1a0501569aa9-manager-config\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.909386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-apiservice-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.917357 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-webhook-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.924636 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jst6f\" (UniqueName: \"kubernetes.io/projected/425e262b-13e9-474a-85f5-1a0501569aa9-kube-api-access-jst6f\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:19:59 crc kubenswrapper[4762]: I0217 14:19:59.933466 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/425e262b-13e9-474a-85f5-1a0501569aa9-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59cfb98864-gc6tj\" (UID: \"425e262b-13e9-474a-85f5-1a0501569aa9\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:20:00 crc kubenswrapper[4762]: I0217 14:20:00.024436 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:20:00 crc kubenswrapper[4762]: I0217 14:20:00.505784 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj"] Feb 17 14:20:00 crc kubenswrapper[4762]: W0217 14:20:00.517573 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod425e262b_13e9_474a_85f5_1a0501569aa9.slice/crio-9da11bc1ffc6082202e663ec084b13703b31be4c8ff4679b5a7451883f07c25d WatchSource:0}: Error finding container 9da11bc1ffc6082202e663ec084b13703b31be4c8ff4679b5a7451883f07c25d: Status 404 returned error can't find the container with id 9da11bc1ffc6082202e663ec084b13703b31be4c8ff4679b5a7451883f07c25d Feb 17 14:20:01 crc kubenswrapper[4762]: I0217 14:20:01.512470 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" event={"ID":"425e262b-13e9-474a-85f5-1a0501569aa9","Type":"ContainerStarted","Data":"9da11bc1ffc6082202e663ec084b13703b31be4c8ff4679b5a7451883f07c25d"} Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.183905 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-m424n"] Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.186460 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.190343 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-6v9fp" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.190342 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.190458 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.199721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-m424n"] Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.254779 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpqm\" (UniqueName: \"kubernetes.io/projected/4207d6ad-eef4-44d0-9eb5-814f9ec323ad-kube-api-access-cjpqm\") pod \"cluster-logging-operator-c769fd969-m424n\" (UID: \"4207d6ad-eef4-44d0-9eb5-814f9ec323ad\") " pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.356283 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpqm\" (UniqueName: \"kubernetes.io/projected/4207d6ad-eef4-44d0-9eb5-814f9ec323ad-kube-api-access-cjpqm\") pod \"cluster-logging-operator-c769fd969-m424n\" (UID: \"4207d6ad-eef4-44d0-9eb5-814f9ec323ad\") " pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.381072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpqm\" (UniqueName: \"kubernetes.io/projected/4207d6ad-eef4-44d0-9eb5-814f9ec323ad-kube-api-access-cjpqm\") pod \"cluster-logging-operator-c769fd969-m424n\" (UID: 
\"4207d6ad-eef4-44d0-9eb5-814f9ec323ad\") " pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" Feb 17 14:20:03 crc kubenswrapper[4762]: I0217 14:20:03.513068 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" Feb 17 14:20:04 crc kubenswrapper[4762]: I0217 14:20:04.795721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-m424n"] Feb 17 14:20:04 crc kubenswrapper[4762]: W0217 14:20:04.801895 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4207d6ad_eef4_44d0_9eb5_814f9ec323ad.slice/crio-18a1026aac789dc921a1095308ea26f754c5983a3614a133d174b50d579f16c8 WatchSource:0}: Error finding container 18a1026aac789dc921a1095308ea26f754c5983a3614a133d174b50d579f16c8: Status 404 returned error can't find the container with id 18a1026aac789dc921a1095308ea26f754c5983a3614a133d174b50d579f16c8 Feb 17 14:20:05 crc kubenswrapper[4762]: I0217 14:20:05.631588 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" event={"ID":"4207d6ad-eef4-44d0-9eb5-814f9ec323ad","Type":"ContainerStarted","Data":"18a1026aac789dc921a1095308ea26f754c5983a3614a133d174b50d579f16c8"} Feb 17 14:20:11 crc kubenswrapper[4762]: I0217 14:20:11.003425 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" event={"ID":"425e262b-13e9-474a-85f5-1a0501569aa9","Type":"ContainerStarted","Data":"5a2b2c3471a7d4400ce659435ef92e98f394ddd4e1d2c12fcab586911d8287b5"} Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.179960 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.181796 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.207806 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.325284 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.325340 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.325426 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqm2\" (UniqueName: \"kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.426764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqm2\" (UniqueName: \"kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.426858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.426897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.427564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.428525 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.444741 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mtqm2\" (UniqueName: \"kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2\") pod \"redhat-marketplace-q59t8\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:19 crc kubenswrapper[4762]: I0217 14:20:19.501923 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:23 crc kubenswrapper[4762]: I0217 14:20:23.813873 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] Feb 17 14:20:23 crc kubenswrapper[4762]: W0217 14:20:23.818414 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ad6fb3_d34a_4e2e_a675_42d195c7a15d.slice/crio-37862afb6df0a0b4bfe5e401d42c637cce7c9dd8b196cb5f94eff49c1e9ac6ae WatchSource:0}: Error finding container 37862afb6df0a0b4bfe5e401d42c637cce7c9dd8b196cb5f94eff49c1e9ac6ae: Status 404 returned error can't find the container with id 37862afb6df0a0b4bfe5e401d42c637cce7c9dd8b196cb5f94eff49c1e9ac6ae Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.217366 4762 generic.go:334] "Generic (PLEG): container finished" podID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerID="080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da" exitCode=0 Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.217490 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerDied","Data":"080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da"} Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.217529 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerStarted","Data":"37862afb6df0a0b4bfe5e401d42c637cce7c9dd8b196cb5f94eff49c1e9ac6ae"} Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.224201 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" event={"ID":"425e262b-13e9-474a-85f5-1a0501569aa9","Type":"ContainerStarted","Data":"c9c909d76fda30f8c99b63d336e3b998e0e7a3ac159d8a7cb9fed5a437724609"} Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.224357 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.226892 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" event={"ID":"4207d6ad-eef4-44d0-9eb5-814f9ec323ad","Type":"ContainerStarted","Data":"f957340eb27669b84497be441562a9b0d59d8136d02ef7d259d384060b448e4d"} Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.228023 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.262381 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-59cfb98864-gc6tj" podStartSLOduration=2.396072204 podStartE2EDuration="25.26236368s" podCreationTimestamp="2026-02-17 14:19:59 +0000 UTC" 
firstStartedPulling="2026-02-17 14:20:00.520208298 +0000 UTC m=+881.100208940" lastFinishedPulling="2026-02-17 14:20:23.386499764 +0000 UTC m=+903.966500416" observedRunningTime="2026-02-17 14:20:24.258739521 +0000 UTC m=+904.838740173" watchObservedRunningTime="2026-02-17 14:20:24.26236368 +0000 UTC m=+904.842364332" Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.283846 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-c769fd969-m424n" podStartSLOduration=2.793272667 podStartE2EDuration="21.283828834s" podCreationTimestamp="2026-02-17 14:20:03 +0000 UTC" firstStartedPulling="2026-02-17 14:20:04.803563483 +0000 UTC m=+885.383564155" lastFinishedPulling="2026-02-17 14:20:23.29411967 +0000 UTC m=+903.874120322" observedRunningTime="2026-02-17 14:20:24.280553635 +0000 UTC m=+904.860554297" watchObservedRunningTime="2026-02-17 14:20:24.283828834 +0000 UTC m=+904.863829486" Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.671141 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:20:24 crc kubenswrapper[4762]: I0217 14:20:24.671194 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:20:26 crc kubenswrapper[4762]: I0217 14:20:26.303103 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerStarted","Data":"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414"} Feb 17 14:20:27 crc kubenswrapper[4762]: I0217 14:20:27.310731 4762 generic.go:334] "Generic (PLEG): container finished" podID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerID="80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414" exitCode=0 Feb 17 14:20:27 crc kubenswrapper[4762]: I0217 14:20:27.310823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerDied","Data":"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414"} Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.753934 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.755053 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.811512 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.811541 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.818217 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.920555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:28 crc kubenswrapper[4762]: I0217 14:20:28.920615 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hfq4\" (UniqueName: \"kubernetes.io/projected/50d51776-6e7c-4ffe-a40d-01f268e35537-kube-api-access-2hfq4\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.022481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.022581 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hfq4\" (UniqueName: \"kubernetes.io/projected/50d51776-6e7c-4ffe-a40d-01f268e35537-kube-api-access-2hfq4\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.040734 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hfq4\" (UniqueName: \"kubernetes.io/projected/50d51776-6e7c-4ffe-a40d-01f268e35537-kube-api-access-2hfq4\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.168911 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.169175 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44cf2ed250c5616877a7822cb7df81bf8e07481f3acb8338256f88b7dd4c8b79/globalmount\"" pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.324838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerStarted","Data":"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b"} Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.345560 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q59t8" podStartSLOduration=6.709166001 podStartE2EDuration="10.345544859s" podCreationTimestamp="2026-02-17 14:20:19 +0000 UTC" firstStartedPulling="2026-02-17 14:20:24.219509613 +0000 UTC m=+904.799510285" lastFinishedPulling="2026-02-17 14:20:27.855888491 +0000 UTC m=+908.435889143" observedRunningTime="2026-02-17 14:20:29.344148141 +0000 UTC m=+909.924148793" watchObservedRunningTime="2026-02-17 14:20:29.345544859 +0000 UTC m=+909.925545511" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.489398 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-129c3aea-24d1-40da-ab50-2e7b47badb9d\") pod \"minio\" (UID: \"50d51776-6e7c-4ffe-a40d-01f268e35537\") " pod="minio-dev/minio" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.503058 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.503126 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:29 crc kubenswrapper[4762]: I0217 14:20:29.729069 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 17 14:20:30 crc kubenswrapper[4762]: I0217 14:20:30.576184 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-q59t8" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="registry-server" probeResult="failure" output=< Feb 17 14:20:30 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:20:30 crc kubenswrapper[4762]: > Feb 17 14:20:30 crc kubenswrapper[4762]: I0217 14:20:30.765215 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 17 14:20:30 crc kubenswrapper[4762]: W0217 14:20:30.781549 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d51776_6e7c_4ffe_a40d_01f268e35537.slice/crio-01dc8e39e90708c7e25d0e5a9e40ebffa76db39502b57eb87c30a1ca62c1a473 WatchSource:0}: Error finding container 01dc8e39e90708c7e25d0e5a9e40ebffa76db39502b57eb87c30a1ca62c1a473: Status 404 returned error can't find the container with id 01dc8e39e90708c7e25d0e5a9e40ebffa76db39502b57eb87c30a1ca62c1a473 Feb 17 14:20:31 crc kubenswrapper[4762]: I0217 14:20:31.662272 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"50d51776-6e7c-4ffe-a40d-01f268e35537","Type":"ContainerStarted","Data":"01dc8e39e90708c7e25d0e5a9e40ebffa76db39502b57eb87c30a1ca62c1a473"} Feb 17 14:20:39 crc kubenswrapper[4762]: I0217 14:20:39.695999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:39 crc kubenswrapper[4762]: I0217 14:20:39.747337 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:39 crc kubenswrapper[4762]: I0217 14:20:39.928137 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.099335 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q59t8" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="registry-server" containerID="cri-o://6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b" gracePeriod=2 Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.717331 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.861914 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content\") pod \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.862148 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtqm2\" (UniqueName: \"kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2\") pod \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.862199 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities\") pod \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\" (UID: \"56ad6fb3-d34a-4e2e-a675-42d195c7a15d\") " Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.863264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities" (OuterVolumeSpecName: "utilities") pod "56ad6fb3-d34a-4e2e-a675-42d195c7a15d" (UID: "56ad6fb3-d34a-4e2e-a675-42d195c7a15d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.867495 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2" (OuterVolumeSpecName: "kube-api-access-mtqm2") pod "56ad6fb3-d34a-4e2e-a675-42d195c7a15d" (UID: "56ad6fb3-d34a-4e2e-a675-42d195c7a15d"). InnerVolumeSpecName "kube-api-access-mtqm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.901366 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56ad6fb3-d34a-4e2e-a675-42d195c7a15d" (UID: "56ad6fb3-d34a-4e2e-a675-42d195c7a15d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.963279 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.963312 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:41 crc kubenswrapper[4762]: I0217 14:20:41.963324 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtqm2\" (UniqueName: \"kubernetes.io/projected/56ad6fb3-d34a-4e2e-a675-42d195c7a15d-kube-api-access-mtqm2\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.105985 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"50d51776-6e7c-4ffe-a40d-01f268e35537","Type":"ContainerStarted","Data":"e6e8eaac9fb4a731c94235107e7f895d32ae38da7cd57f83b6f05cc2868020b5"} Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.109001 4762 generic.go:334] "Generic (PLEG): container finished" podID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerID="6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b" exitCode=0 Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.109038 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q59t8" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.109043 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerDied","Data":"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b"} Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.109073 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q59t8" event={"ID":"56ad6fb3-d34a-4e2e-a675-42d195c7a15d","Type":"ContainerDied","Data":"37862afb6df0a0b4bfe5e401d42c637cce7c9dd8b196cb5f94eff49c1e9ac6ae"} Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.109094 4762 scope.go:117] "RemoveContainer" containerID="6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.118802 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.19594951 podStartE2EDuration="16.118780973s" podCreationTimestamp="2026-02-17 14:20:26 +0000 UTC" firstStartedPulling="2026-02-17 14:20:30.784971526 +0000 UTC m=+911.364972178" lastFinishedPulling="2026-02-17 14:20:41.707802989 +0000 UTC m=+922.287803641" observedRunningTime="2026-02-17 14:20:42.116896922 +0000 UTC m=+922.696897574" watchObservedRunningTime="2026-02-17 14:20:42.118780973 +0000 UTC m=+922.698781625" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.131566 4762 scope.go:117] "RemoveContainer" containerID="80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.137295 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.152309 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q59t8"] 
Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.159608 4762 scope.go:117] "RemoveContainer" containerID="080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.189783 4762 scope.go:117] "RemoveContainer" containerID="6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b" Feb 17 14:20:42 crc kubenswrapper[4762]: E0217 14:20:42.190351 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b\": container with ID starting with 6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b not found: ID does not exist" containerID="6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.190398 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b"} err="failed to get container status \"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b\": rpc error: code = NotFound desc = could not find container \"6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b\": container with ID starting with 6033c5ad5fdff429e020f49aa8c209f6f84bf5cf8831620c6fb316329497ef0b not found: ID does not exist" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.190426 4762 scope.go:117] "RemoveContainer" containerID="80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414" Feb 17 14:20:42 crc kubenswrapper[4762]: E0217 14:20:42.190952 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414\": container with ID starting with 80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414 not found: ID does not exist" containerID="80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.190979 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414"} err="failed to get container status \"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414\": rpc error: code = NotFound desc = could not find container \"80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414\": container with ID starting with 80e56427f0c808b6444d443fd957fbb3df7fe5ec537899cc592388aa3172c414 not found: ID does not exist" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.190998 4762 scope.go:117] "RemoveContainer" containerID="080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da" Feb 17 14:20:42 crc kubenswrapper[4762]: E0217 14:20:42.191198 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da\": container with ID starting with 080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da not found: ID does not exist" containerID="080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da" Feb 17 14:20:42 crc kubenswrapper[4762]: I0217 14:20:42.191220 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da"} err="failed to get 
container status \"080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da\": rpc error: code = NotFound desc = could not find container \"080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da\": container with ID starting with 080dd211cc7fa8a344a1269773b644b5aba6290d6ae15673941c01b2d44829da not found: ID does not exist" Feb 17 14:20:42 crc kubenswrapper[4762]: E0217 14:20:42.276387 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ad6fb3_d34a_4e2e_a675_42d195c7a15d.slice\": RecentStats: unable to find data in memory cache]" Feb 17 14:20:44 crc kubenswrapper[4762]: I0217 14:20:44.078736 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" path="/var/lib/kubelet/pods/56ad6fb3-d34a-4e2e-a675-42d195c7a15d/volumes" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.722178 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t"] Feb 17 14:20:48 crc kubenswrapper[4762]: E0217 14:20:48.725749 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="registry-server" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.725772 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="registry-server" Feb 17 14:20:48 crc kubenswrapper[4762]: E0217 14:20:48.725790 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="extract-content" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.725798 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="extract-content" Feb 17 14:20:48 crc kubenswrapper[4762]: E0217 14:20:48.725828 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="extract-utilities" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.725837 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="extract-utilities" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.725990 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ad6fb3-d34a-4e2e-a675-42d195c7a15d" containerName="registry-server" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.726566 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.732891 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.733588 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.733662 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-q4t8c" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.733764 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.733782 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.745964 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t"] Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.856800 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.856980 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.857010 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7th5k\" (UniqueName: \"kubernetes.io/projected/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-kube-api-access-7th5k\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.857074 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-config\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.857099 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.958134 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.958192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7th5k\" (UniqueName: \"kubernetes.io/projected/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-kube-api-access-7th5k\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.958244 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-config\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.958267 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.958320 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.959281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.959410 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-config\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:48 crc kubenswrapper[4762]: I0217 14:20:48.980420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.025554 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.042366 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.049594 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7th5k\" (UniqueName: \"kubernetes.io/projected/c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1-kube-api-access-7th5k\") pod \"logging-loki-distributor-5d5548c9f5-4kq9t\" (UID: \"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.059455 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.061403 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.063356 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.063578 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.064833 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.142173 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.143932 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.148159 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.148398 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164739 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164787 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrz7s\" (UniqueName: \"kubernetes.io/projected/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-kube-api-access-vrz7s\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164849 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-config\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164939 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.164969 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.167712 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269127 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269759 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-config\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269884 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-config\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269935 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.269974 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270003 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwxd\" (UniqueName: \"kubernetes.io/projected/6b87d089-b22d-483e-88c7-4d4c2e13c566-kube-api-access-mpwxd\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270098 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: 
\"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrz7s\" (UniqueName: \"kubernetes.io/projected/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-kube-api-access-vrz7s\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.270411 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.272875 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.278073 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.279050 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.283727 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-config\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.289622 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.292438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " 
pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.299412 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.299580 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.300241 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.299430 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.300413 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.316057 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrz7s\" (UniqueName: \"kubernetes.io/projected/5fed95ad-ee31-4f63-a4ef-4eaf471c49ee-kube-api-access-vrz7s\") pod \"logging-loki-querier-76bf7b6d45-rfqd7\" (UID: \"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.345124 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.345325 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.346293 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.355375 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-bzk75" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.363351 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372460 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tenants\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372549 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-686nf\" (UniqueName: \"kubernetes.io/projected/8a1683ec-0421-4086-8422-8a638b768879-kube-api-access-686nf\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372626 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372671 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-config\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 
14:20:49.372695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-rbac\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372722 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.372848 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwxd\" (UniqueName: \"kubernetes.io/projected/6b87d089-b22d-483e-88c7-4d4c2e13c566-kube-api-access-mpwxd\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.374058 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"] Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.374458 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-config\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.374459 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-ca-bundle\") pod 
\"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.377354 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.378032 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/6b87d089-b22d-483e-88c7-4d4c2e13c566-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.389229 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.397049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwxd\" (UniqueName: \"kubernetes.io/projected/6b87d089-b22d-483e-88c7-4d4c2e13c566-kube-api-access-mpwxd\") pod \"logging-loki-query-frontend-6d6859c548-lm9mq\" (UID: \"6b87d089-b22d-483e-88c7-4d4c2e13c566\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.465998 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474373 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-686nf\" (UniqueName: \"kubernetes.io/projected/8a1683ec-0421-4086-8422-8a638b768879-kube-api-access-686nf\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474565 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474586 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bts88\" (UniqueName: \"kubernetes.io/projected/a4bee09c-f081-4ca0-aef8-40effbd263dd-kube-api-access-bts88\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474608 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-rbac\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474629 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" 
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474676 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474701 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474719 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-rbac\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474741 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tenants\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.474902 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tenants\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.479167 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tenants\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: E0217 14:20:49.479332 4762 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found
Feb 17 14:20:49 crc kubenswrapper[4762]: E0217 14:20:49.479390 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret podName:8a1683ec-0421-4086-8422-8a638b768879 nodeName:}" failed. No retries permitted until 2026-02-17 14:20:49.979368491 +0000 UTC m=+930.559369333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret") pod "logging-loki-gateway-78d96f4c68-sf9z2" (UID: "8a1683ec-0421-4086-8422-8a638b768879") : secret "logging-loki-gateway-http" not found
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.479726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.480030 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.480183 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-rbac\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.480267 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.480598 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.514154 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-686nf\" (UniqueName: \"kubernetes.io/projected/8a1683ec-0421-4086-8422-8a638b768879-kube-api-access-686nf\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576446 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576527 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576576 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576669 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576691 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bts88\" (UniqueName: \"kubernetes.io/projected/a4bee09c-f081-4ca0-aef8-40effbd263dd-kube-api-access-bts88\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-rbac\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.576748 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tenants\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.578060 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.578615 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-rbac\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.581170 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-lokistack-gateway\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.584230 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.586331 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.589414 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-tenants\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.599760 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a4bee09c-f081-4ca0-aef8-40effbd263dd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.602385 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bts88\" (UniqueName: \"kubernetes.io/projected/a4bee09c-f081-4ca0-aef8-40effbd263dd-kube-api-access-bts88\") pod \"logging-loki-gateway-78d96f4c68-9bhm5\" (UID: \"a4bee09c-f081-4ca0-aef8-40effbd263dd\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.699025 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.888156 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.889688 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.895255 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.895777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.909202 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.960307 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t"]
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.974757 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7"]
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982606 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-config\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982721 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982780 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982820 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982852 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.982917 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npgln\" (UniqueName: \"kubernetes.io/projected/f7a72999-d771-4b3e-ba91-38078274aa35-kube-api-access-npgln\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:49 crc kubenswrapper[4762]: I0217 14:20:49.988351 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8a1683ec-0421-4086-8422-8a638b768879-tls-secret\") pod \"logging-loki-gateway-78d96f4c68-sf9z2\" (UID: \"8a1683ec-0421-4086-8422-8a638b768879\") " pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084668 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npgln\" (UniqueName: \"kubernetes.io/projected/f7a72999-d771-4b3e-ba91-38078274aa35-kube-api-access-npgln\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-config\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.084971 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.085016 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.088061 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.089104 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a72999-d771-4b3e-ba91-38078274aa35-config\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.095839 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.095871 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c554ce24147306b6cfc74c8b666dd77ec260eb7fee6ebdef771da74aac624378/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.096623 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.096678 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cb2a56d1e32b60710e63c5bab7dd01dcc9603f3bd70358da9901a0b1c82a3d1/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.108509 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.109175 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npgln\" (UniqueName: \"kubernetes.io/projected/f7a72999-d771-4b3e-ba91-38078274aa35-kube-api-access-npgln\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.113416 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.125504 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.126818 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.133702 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.133849 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.145418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f7a72999-d771-4b3e-ba91-38078274aa35-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.147281 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.186621 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187037 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6pd\" (UniqueName: \"kubernetes.io/projected/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-kube-api-access-zx6pd\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187454 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-config\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.187544 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.188157 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f2caf94-6eb5-4a83-974d-e98a4eaf4320\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.198159 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6d6b637-ab83-4afc-b9e1-7f7de0c54cdb\") pod \"logging-loki-ingester-0\" (UID: \"f7a72999-d771-4b3e-ba91-38078274aa35\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.212603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" event={"ID":"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1","Type":"ContainerStarted","Data":"85ed400534cf9cfb6987c1ba03ca74603ba17bd7e22b967e1812297d87459d5a"}
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.213805 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.215011 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.224108 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.226490 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" event={"ID":"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee","Type":"ContainerStarted","Data":"bd6333c4adbeb1c652eddb244aa28d0421754c9d87273d03f8daef28534dee44"}
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.227307 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.249701 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.258253 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.282128 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.292753 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.292825 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.292876 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-config\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.292927 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.292998 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.293028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.293057 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6pd\" (UniqueName: \"kubernetes.io/projected/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-kube-api-access-zx6pd\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.294015 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.294797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-config\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.297855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.300323 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.303252 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.303341 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b0e016a20a2e1568a1a71071c900f01ce057cf3982286537e2f91a2f4536207/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.305737 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.310000 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6pd\" (UniqueName: \"kubernetes.io/projected/42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c-kube-api-access-zx6pd\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.335500 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d364e47-5da2-47c1-9e45-2eb0aad42eeb\") pod \"logging-loki-compactor-0\" (UID: \"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394317 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrcm\" (UniqueName: \"kubernetes.io/projected/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-kube-api-access-gkrcm\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394402 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0642d47-5d43-4050-afd6-ff53c3106669\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0642d47-5d43-4050-afd6-ff53c3106669\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394451 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394505 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.394601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.405160 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq"]
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495727 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0642d47-5d43-4050-afd6-ff53c3106669\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0642d47-5d43-4050-afd6-ff53c3106669\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495878 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495943 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.495983 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrcm\" (UniqueName: \"kubernetes.io/projected/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-kube-api-access-gkrcm\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.497236 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.498308 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.498901 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.498932 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0642d47-5d43-4050-afd6-ff53c3106669\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0642d47-5d43-4050-afd6-ff53c3106669\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6d105044919932d54dbf4f981f8a77f47e0e3559be8ee903bb4ca8258c1d8b48/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.506619 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.506949 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.507261 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.515027 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrcm\" (UniqueName: \"kubernetes.io/projected/c6d7c750-d784-4839-b9a6-8dc6348e3a7c-kube-api-access-gkrcm\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: W0217 14:20:50.536995 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4bee09c_f081_4ca0_aef8_40effbd263dd.slice/crio-682bb866b6d5ce5f36f802fc25b26c3a8a14a67b253ba1278e2dc5345fe564e1 WatchSource:0}: Error finding container 682bb866b6d5ce5f36f802fc25b26c3a8a14a67b253ba1278e2dc5345fe564e1: Status 404 returned error can't find the container with id 682bb866b6d5ce5f36f802fc25b26c3a8a14a67b253ba1278e2dc5345fe564e1 Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.537452 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5"] Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.539866 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.552948 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0642d47-5d43-4050-afd6-ff53c3106669\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0642d47-5d43-4050-afd6-ff53c3106669\") pod \"logging-loki-index-gateway-0\" (UID: \"c6d7c750-d784-4839-b9a6-8dc6348e3a7c\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.568308 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2"] Feb 17 14:20:50 crc kubenswrapper[4762]: W0217 14:20:50.583005 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a1683ec_0421_4086_8422_8a638b768879.slice/crio-48842cfcde60e3bfa544d357da1e9a20442954b2579eac62ab7beb356a38b382 WatchSource:0}: Error finding container 48842cfcde60e3bfa544d357da1e9a20442954b2579eac62ab7beb356a38b382: Status 404 returned error can't find the container with id 48842cfcde60e3bfa544d357da1e9a20442954b2579eac62ab7beb356a38b382 Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.701610 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Feb 17 14:20:50 crc kubenswrapper[4762]: W0217 14:20:50.719364 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a72999_d771_4b3e_ba91_38078274aa35.slice/crio-ded6dac82755ed704f36540787b379acc42905b9c49ea5142191d0e0c6e2f496 WatchSource:0}: Error finding container ded6dac82755ed704f36540787b379acc42905b9c49ea5142191d0e0c6e2f496: Status 404 returned error can't find the container with id ded6dac82755ed704f36540787b379acc42905b9c49ea5142191d0e0c6e2f496 Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.854198 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:50 crc kubenswrapper[4762]: I0217 14:20:50.977424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.235923 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" event={"ID":"6b87d089-b22d-483e-88c7-4d4c2e13c566","Type":"ContainerStarted","Data":"14e538a189128dfb180551eadda38c97093bd26c74142854cb5cb42931b7ba0a"} Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.237031 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c","Type":"ContainerStarted","Data":"27ec34767c9478e9b906791bcad70741c2e6461a2bb80f8d817cfc55eecb1b16"} Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.237965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" event={"ID":"8a1683ec-0421-4086-8422-8a638b768879","Type":"ContainerStarted","Data":"48842cfcde60e3bfa544d357da1e9a20442954b2579eac62ab7beb356a38b382"} Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.238722 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" event={"ID":"a4bee09c-f081-4ca0-aef8-40effbd263dd","Type":"ContainerStarted","Data":"682bb866b6d5ce5f36f802fc25b26c3a8a14a67b253ba1278e2dc5345fe564e1"} Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.239853 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"f7a72999-d771-4b3e-ba91-38078274aa35","Type":"ContainerStarted","Data":"ded6dac82755ed704f36540787b379acc42905b9c49ea5142191d0e0c6e2f496"} Feb 17 14:20:51 crc kubenswrapper[4762]: I0217 14:20:51.306689 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Feb 17 14:20:52 crc kubenswrapper[4762]: I0217 14:20:52.250327 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"c6d7c750-d784-4839-b9a6-8dc6348e3a7c","Type":"ContainerStarted","Data":"bd6cd70402e8491e277bccc5cab9e162c8e0058b91192913b54beff53e2085ee"} Feb 17 14:20:54 crc kubenswrapper[4762]: I0217 14:20:54.621208 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:20:54 crc kubenswrapper[4762]: I0217 14:20:54.621958 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.295746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" event={"ID":"c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1","Type":"ContainerStarted","Data":"eeb6d0b615abfd2c3022e5506e739856be7f748d62aab82b639f0eba656db3f5"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.296300 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.298562 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"c6d7c750-d784-4839-b9a6-8dc6348e3a7c","Type":"ContainerStarted","Data":"20f8547965e2837fa2b572db47180877eba174ad92b5814cdb269023159e64bd"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.298705 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.301406 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"f7a72999-d771-4b3e-ba91-38078274aa35","Type":"ContainerStarted","Data":"1d10c7530611c29e1346ead08f697f52a4b1edcf9c93625bdd5cae6028b3d659"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.301529 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.303244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" event={"ID":"6b87d089-b22d-483e-88c7-4d4c2e13c566","Type":"ContainerStarted","Data":"43f0e57b854a53503df7e6c279a82ad82325f90346397611eb5249a2a5502428"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.303306 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.304791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c","Type":"ContainerStarted","Data":"68800643152a674e2f85546da06904db5e76c636c0774f0ea7b0e88aa9ae5938"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.304927 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.306934 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" event={"ID":"5fed95ad-ee31-4f63-a4ef-4eaf471c49ee","Type":"ContainerStarted","Data":"7d5f6850fad3d1e4f6bf0c6a6d83beb4bb42bfc5bc1669e52210e09be4f06350"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.307048 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.308602 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" event={"ID":"8a1683ec-0421-4086-8422-8a638b768879","Type":"ContainerStarted","Data":"24fc6c2fd4944d9905ca07fc197f50697229e8db75d5ae60088fa97192f5fea4"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.309996 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" event={"ID":"a4bee09c-f081-4ca0-aef8-40effbd263dd","Type":"ContainerStarted","Data":"7190f626f8e02cfdada1785a9ac87e95160b03e4556678fb49bf96d23052e26c"} Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.314874 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" 
podStartSLOduration=2.769885495 podStartE2EDuration="9.314861619s" podCreationTimestamp="2026-02-17 14:20:48 +0000 UTC" firstStartedPulling="2026-02-17 14:20:49.985760317 +0000 UTC m=+930.565760969" lastFinishedPulling="2026-02-17 14:20:56.530736431 +0000 UTC m=+937.110737093" observedRunningTime="2026-02-17 14:20:57.313481652 +0000 UTC m=+937.893482304" watchObservedRunningTime="2026-02-17 14:20:57.314861619 +0000 UTC m=+937.894862271" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.335681 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.514663713 podStartE2EDuration="9.335665651s" podCreationTimestamp="2026-02-17 14:20:48 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.726965026 +0000 UTC m=+931.306965688" lastFinishedPulling="2026-02-17 14:20:56.547966974 +0000 UTC m=+937.127967626" observedRunningTime="2026-02-17 14:20:57.329156212 +0000 UTC m=+937.909156864" watchObservedRunningTime="2026-02-17 14:20:57.335665651 +0000 UTC m=+937.915666303" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.351434 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" podStartSLOduration=2.209695633 podStartE2EDuration="8.351417804s" podCreationTimestamp="2026-02-17 14:20:49 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.407997211 +0000 UTC m=+930.987997863" lastFinishedPulling="2026-02-17 14:20:56.549719382 +0000 UTC m=+937.129720034" observedRunningTime="2026-02-17 14:20:57.346374335 +0000 UTC m=+937.926374987" watchObservedRunningTime="2026-02-17 14:20:57.351417804 +0000 UTC m=+937.931418456" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.367575 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.803517692 podStartE2EDuration="8.367552137s" podCreationTimestamp="2026-02-17 14:20:49 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.997301036 +0000 UTC m=+931.577301688" lastFinishedPulling="2026-02-17 14:20:56.561335481 +0000 UTC m=+937.141336133" observedRunningTime="2026-02-17 14:20:57.362005305 +0000 UTC m=+937.942005957" watchObservedRunningTime="2026-02-17 14:20:57.367552137 +0000 UTC m=+937.947552789" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.380673 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" podStartSLOduration=2.881920305 podStartE2EDuration="9.380639697s" podCreationTimestamp="2026-02-17 14:20:48 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.002015644 +0000 UTC m=+930.582016296" lastFinishedPulling="2026-02-17 14:20:56.500735036 +0000 UTC m=+937.080735688" observedRunningTime="2026-02-17 14:20:57.378397416 +0000 UTC m=+937.958398068" watchObservedRunningTime="2026-02-17 14:20:57.380639697 +0000 UTC m=+937.960640349" Feb 17 14:20:57 crc kubenswrapper[4762]: I0217 14:20:57.407426 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.205294404 podStartE2EDuration="8.407399933s" podCreationTimestamp="2026-02-17 14:20:49 +0000 UTC" firstStartedPulling="2026-02-17 14:20:51.307515291 +0000 UTC m=+931.887515943" lastFinishedPulling="2026-02-17 14:20:56.50962082 +0000 UTC m=+937.089621472" observedRunningTime="2026-02-17 14:20:57.399486035 +0000 UTC m=+937.979486677" 
watchObservedRunningTime="2026-02-17 14:20:57.407399933 +0000 UTC m=+937.987400585" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.336540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" event={"ID":"a4bee09c-f081-4ca0-aef8-40effbd263dd","Type":"ContainerStarted","Data":"9ece9d526983ae7b3f6c08ea8d922dcc5adb5304387cc5614a3e3988d7e90150"} Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.336926 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.336944 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.339525 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" event={"ID":"8a1683ec-0421-4086-8422-8a638b768879","Type":"ContainerStarted","Data":"a4f9f5777e1cd4779ad4ed6a92df02d0c165a742a4b3acddb3738b7f0f6c2296"} Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.339772 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.347846 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.348199 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.353837 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.357685 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" podStartSLOduration=2.451019724 podStartE2EDuration="11.357671069s" podCreationTimestamp="2026-02-17 14:20:49 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.546869457 +0000 UTC m=+931.126870109" lastFinishedPulling="2026-02-17 14:20:59.453520812 +0000 UTC m=+940.033521454" observedRunningTime="2026-02-17 14:21:00.356001463 +0000 UTC m=+940.936002115" watchObservedRunningTime="2026-02-17 14:21:00.357671069 +0000 UTC m=+940.937671721" Feb 17 14:21:00 crc kubenswrapper[4762]: I0217 14:21:00.403478 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" podStartSLOduration=2.539708942 podStartE2EDuration="11.403461868s" podCreationTimestamp="2026-02-17 14:20:49 +0000 UTC" firstStartedPulling="2026-02-17 14:20:50.585387196 +0000 UTC m=+931.165387848" lastFinishedPulling="2026-02-17 14:20:59.449140122 +0000 UTC m=+940.029140774" observedRunningTime="2026-02-17 14:21:00.400093755 +0000 UTC m=+940.980094407" watchObservedRunningTime="2026-02-17 14:21:00.403461868 +0000 UTC m=+940.983462520" Feb 17 14:21:01 crc kubenswrapper[4762]: I0217 14:21:01.346051 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:21:01 crc kubenswrapper[4762]: I0217 14:21:01.355428 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-gateway-78d96f4c68-sf9z2" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.217743 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.221996 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.243134 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.292142 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwn4s\" (UniqueName: \"kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.292385 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.292463 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.417586 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwn4s\" (UniqueName: \"kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.417682 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.417723 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.418325 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.418592 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.452388 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwn4s\" (UniqueName: \"kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s\") pod \"community-operators-7558h\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:16 crc kubenswrapper[4762]: I0217 14:21:16.540042 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:17 crc kubenswrapper[4762]: I0217 14:21:17.009998 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:17 crc kubenswrapper[4762]: W0217 14:21:17.027811 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040c9a6f_c6aa_4e11_9fdb_f578c55ab809.slice/crio-e8e716efa7f90404015abcc281bd3e89d254efbb23cfdc45636942ed1c52e106 WatchSource:0}: Error finding container e8e716efa7f90404015abcc281bd3e89d254efbb23cfdc45636942ed1c52e106: Status 404 returned error can't find the container with id e8e716efa7f90404015abcc281bd3e89d254efbb23cfdc45636942ed1c52e106 Feb 17 14:21:17 crc kubenswrapper[4762]: I0217 14:21:17.464059 4762 generic.go:334] "Generic (PLEG): container finished" podID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerID="f12c39f4103da191659a478a0f5fb617df517461bc4a1b6559efd84408754dce" exitCode=0 Feb 17 14:21:17 crc kubenswrapper[4762]: I0217 14:21:17.464161 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerDied","Data":"f12c39f4103da191659a478a0f5fb617df517461bc4a1b6559efd84408754dce"} Feb 17 14:21:17 crc kubenswrapper[4762]: I0217 14:21:17.464365 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerStarted","Data":"e8e716efa7f90404015abcc281bd3e89d254efbb23cfdc45636942ed1c52e106"} Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.355816 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-4kq9t" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.470037 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76bf7b6d45-rfqd7" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.474615 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-lm9mq" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.486536 4762 generic.go:334] "Generic (PLEG): container finished" podID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerID="c2e7955f5346190dd4e182aea68fd2755ed8c9d2aaab90892f2c9f45e9d4edb7" exitCode=0 Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.486578 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" 
event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerDied","Data":"c2e7955f5346190dd4e182aea68fd2755ed8c9d2aaab90892f2c9f45e9d4edb7"} Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.830604 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.832003 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.852085 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.872753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.872850 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ls6m\" (UniqueName: \"kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.873014 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.974913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.975250 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.975395 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ls6m\" (UniqueName: \"kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 14:21:19.975669 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:19 crc kubenswrapper[4762]: I0217 
14:21:19.975685 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.029676 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ls6m\" (UniqueName: \"kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m\") pod \"certified-operators-qhqqj\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.166468 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.233069 4762 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.233135 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f7a72999-d771-4b3e-ba91-38078274aa35" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.495432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerStarted","Data":"c99d1dc1def6c238a0ffa550f6ff8be9c2dfeb7ff611ba649a8f3fd95921f1f2"} Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.522500 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7558h" podStartSLOduration=2.115602516 podStartE2EDuration="4.52248134s" podCreationTimestamp="2026-02-17 14:21:16 +0000 UTC" firstStartedPulling="2026-02-17 14:21:17.466347895 +0000 UTC m=+958.046348547" lastFinishedPulling="2026-02-17 14:21:19.873226719 +0000 UTC m=+960.453227371" observedRunningTime="2026-02-17 14:21:20.518539452 +0000 UTC m=+961.098540104" watchObservedRunningTime="2026-02-17 14:21:20.52248134 +0000 UTC m=+961.102481992" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.554341 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.738416 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:20 crc kubenswrapper[4762]: W0217 14:21:20.744808 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6b5b326_ce22_4423_ae59_bcb2d6595dfd.slice/crio-263c5babab68704f2c68499b3fa3fcbf94bd4edff9ee15944208e002e5ede8da WatchSource:0}: Error finding container 263c5babab68704f2c68499b3fa3fcbf94bd4edff9ee15944208e002e5ede8da: Status 404 returned error can't find the container with id 263c5babab68704f2c68499b3fa3fcbf94bd4edff9ee15944208e002e5ede8da Feb 17 14:21:20 crc kubenswrapper[4762]: I0217 14:21:20.884894 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
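The prober records above treat the ingester's HTTP 503 as a readiness failure and log the start of the response body ("Ingester not ready: this instance owns no tokens" while the Loki ring is still forming). A stripped-down HTTP readiness check in the same spirit; kubelet counts 2xx and 3xx responses as success, and the URL here is an assumption for the sketch:

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 256)) // keep only the start of the body, as the log does
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body=%s", resp.StatusCode, body)
	}
	return nil
}

func main() {
	if err := probe("http://localhost:3100/ready"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}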
status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Feb 17 14:21:21 crc kubenswrapper[4762]: I0217 14:21:21.508547 4762 generic.go:334] "Generic (PLEG): container finished" podID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerID="c0b2f8bd5f01e24014440e31537477c765333f5101f68ec72ad162003389ac7d" exitCode=0 Feb 17 14:21:21 crc kubenswrapper[4762]: I0217 14:21:21.508823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerDied","Data":"c0b2f8bd5f01e24014440e31537477c765333f5101f68ec72ad162003389ac7d"} Feb 17 14:21:21 crc kubenswrapper[4762]: I0217 14:21:21.508899 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerStarted","Data":"263c5babab68704f2c68499b3fa3fcbf94bd4edff9ee15944208e002e5ede8da"} Feb 17 14:21:22 crc kubenswrapper[4762]: I0217 14:21:22.517543 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerStarted","Data":"8228e9093ec0a6eaf45c21b3dcf4bd8ec07578feecc3a0ffb10267999b7e5c99"} Feb 17 14:21:23 crc kubenswrapper[4762]: I0217 14:21:23.527075 4762 generic.go:334] "Generic (PLEG): container finished" podID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerID="8228e9093ec0a6eaf45c21b3dcf4bd8ec07578feecc3a0ffb10267999b7e5c99" exitCode=0 Feb 17 14:21:23 crc kubenswrapper[4762]: I0217 14:21:23.527134 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerDied","Data":"8228e9093ec0a6eaf45c21b3dcf4bd8ec07578feecc3a0ffb10267999b7e5c99"} Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.543083 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerStarted","Data":"2bce65e41281515063ead3134cefbd50119354b5b545a5334a2865dc7c91e132"} Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.562665 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qhqqj" podStartSLOduration=3.056452575 podStartE2EDuration="5.562637938s" podCreationTimestamp="2026-02-17 14:21:19 +0000 UTC" firstStartedPulling="2026-02-17 14:21:21.510114872 +0000 UTC m=+962.090115524" lastFinishedPulling="2026-02-17 14:21:24.016300235 +0000 UTC m=+964.596300887" observedRunningTime="2026-02-17 14:21:24.560439498 +0000 UTC m=+965.140440160" watchObservedRunningTime="2026-02-17 14:21:24.562637938 +0000 UTC m=+965.142638590" Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.621248 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.621350 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.621408 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.622220 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:21:24 crc kubenswrapper[4762]: I0217 14:21:24.622303 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba" gracePeriod=600 Feb 17 14:21:25 crc kubenswrapper[4762]: I0217 14:21:25.553621 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba" exitCode=0 Feb 17 14:21:25 crc kubenswrapper[4762]: I0217 14:21:25.553682 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba"} Feb 17 14:21:25 crc kubenswrapper[4762]: I0217 14:21:25.554110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9"} Feb 17 14:21:25 crc kubenswrapper[4762]: I0217 14:21:25.554132 4762 scope.go:117] "RemoveContainer" containerID="116572c4d79b2feaa81621e7ad3ce8410516799fe8d9dbdb26dfeae29390b841" Feb 17 14:21:26 crc kubenswrapper[4762]: I0217 14:21:26.541418 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:26 crc kubenswrapper[4762]: I0217 14:21:26.541720 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:26 crc kubenswrapper[4762]: I0217 14:21:26.585458 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:26 crc kubenswrapper[4762]: I0217 14:21:26.629015 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:27 crc kubenswrapper[4762]: I0217 14:21:27.794008 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:28 crc kubenswrapper[4762]: I0217 14:21:28.577390 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7558h" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="registry-server" containerID="cri-o://c99d1dc1def6c238a0ffa550f6ff8be9c2dfeb7ff611ba649a8f3fd95921f1f2" gracePeriod=2 Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.587209 4762 generic.go:334] 
"Generic (PLEG): container finished" podID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerID="c99d1dc1def6c238a0ffa550f6ff8be9c2dfeb7ff611ba649a8f3fd95921f1f2" exitCode=0 Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.587260 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerDied","Data":"c99d1dc1def6c238a0ffa550f6ff8be9c2dfeb7ff611ba649a8f3fd95921f1f2"} Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.646808 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.772115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities\") pod \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.772305 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content\") pod \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.772447 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwn4s\" (UniqueName: \"kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s\") pod \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\" (UID: \"040c9a6f-c6aa-4e11-9fdb-f578c55ab809\") " Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.773412 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities" (OuterVolumeSpecName: "utilities") pod "040c9a6f-c6aa-4e11-9fdb-f578c55ab809" (UID: "040c9a6f-c6aa-4e11-9fdb-f578c55ab809"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.777921 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s" (OuterVolumeSpecName: "kube-api-access-xwn4s") pod "040c9a6f-c6aa-4e11-9fdb-f578c55ab809" (UID: "040c9a6f-c6aa-4e11-9fdb-f578c55ab809"). InnerVolumeSpecName "kube-api-access-xwn4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.874872 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwn4s\" (UniqueName: \"kubernetes.io/projected/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-kube-api-access-xwn4s\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:29 crc kubenswrapper[4762]: I0217 14:21:29.874922 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.167540 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.167913 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.216852 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.230682 4762 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.230737 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f7a72999-d771-4b3e-ba91-38078274aa35" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.355462 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "040c9a6f-c6aa-4e11-9fdb-f578c55ab809" (UID: "040c9a6f-c6aa-4e11-9fdb-f578c55ab809"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.382972 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/040c9a6f-c6aa-4e11-9fdb-f578c55ab809-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.603405 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7558h" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.603388 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7558h" event={"ID":"040c9a6f-c6aa-4e11-9fdb-f578c55ab809","Type":"ContainerDied","Data":"e8e716efa7f90404015abcc281bd3e89d254efbb23cfdc45636942ed1c52e106"} Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.604672 4762 scope.go:117] "RemoveContainer" containerID="c99d1dc1def6c238a0ffa550f6ff8be9c2dfeb7ff611ba649a8f3fd95921f1f2" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.620490 4762 scope.go:117] "RemoveContainer" containerID="c2e7955f5346190dd4e182aea68fd2755ed8c9d2aaab90892f2c9f45e9d4edb7" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.649805 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.651133 4762 scope.go:117] "RemoveContainer" containerID="f12c39f4103da191659a478a0f5fb617df517461bc4a1b6559efd84408754dce" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.652523 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:30 crc kubenswrapper[4762]: I0217 14:21:30.654624 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7558h"] Feb 17 14:21:31 crc kubenswrapper[4762]: I0217 14:21:31.596890 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:32 crc kubenswrapper[4762]: I0217 14:21:32.079555 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" path="/var/lib/kubelet/pods/040c9a6f-c6aa-4e11-9fdb-f578c55ab809/volumes" Feb 17 14:21:32 crc kubenswrapper[4762]: I0217 14:21:32.617188 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qhqqj" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="registry-server" containerID="cri-o://2bce65e41281515063ead3134cefbd50119354b5b545a5334a2865dc7c91e132" gracePeriod=2 Feb 17 14:21:33 crc kubenswrapper[4762]: I0217 14:21:33.627038 4762 generic.go:334] "Generic (PLEG): container finished" podID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerID="2bce65e41281515063ead3134cefbd50119354b5b545a5334a2865dc7c91e132" exitCode=0 Feb 17 14:21:33 crc kubenswrapper[4762]: I0217 14:21:33.627308 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerDied","Data":"2bce65e41281515063ead3134cefbd50119354b5b545a5334a2865dc7c91e132"} Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.207814 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.299272 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ls6m\" (UniqueName: \"kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m\") pod \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.299396 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities\") pod \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.299483 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content\") pod \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\" (UID: \"e6b5b326-ce22-4423-ae59-bcb2d6595dfd\") " Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.301302 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities" (OuterVolumeSpecName: "utilities") pod "e6b5b326-ce22-4423-ae59-bcb2d6595dfd" (UID: "e6b5b326-ce22-4423-ae59-bcb2d6595dfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.308969 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m" (OuterVolumeSpecName: "kube-api-access-6ls6m") pod "e6b5b326-ce22-4423-ae59-bcb2d6595dfd" (UID: "e6b5b326-ce22-4423-ae59-bcb2d6595dfd"). InnerVolumeSpecName "kube-api-access-6ls6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.349988 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6b5b326-ce22-4423-ae59-bcb2d6595dfd" (UID: "e6b5b326-ce22-4423-ae59-bcb2d6595dfd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.400952 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.400995 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.401010 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ls6m\" (UniqueName: \"kubernetes.io/projected/e6b5b326-ce22-4423-ae59-bcb2d6595dfd-kube-api-access-6ls6m\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.636691 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhqqj" event={"ID":"e6b5b326-ce22-4423-ae59-bcb2d6595dfd","Type":"ContainerDied","Data":"263c5babab68704f2c68499b3fa3fcbf94bd4edff9ee15944208e002e5ede8da"} Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.636751 4762 scope.go:117] "RemoveContainer" containerID="2bce65e41281515063ead3134cefbd50119354b5b545a5334a2865dc7c91e132" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.636949 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qhqqj" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.847855 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.853110 4762 scope.go:117] "RemoveContainer" containerID="8228e9093ec0a6eaf45c21b3dcf4bd8ec07578feecc3a0ffb10267999b7e5c99" Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.855430 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qhqqj"] Feb 17 14:21:34 crc kubenswrapper[4762]: I0217 14:21:34.873248 4762 scope.go:117] "RemoveContainer" containerID="c0b2f8bd5f01e24014440e31537477c765333f5101f68ec72ad162003389ac7d" Feb 17 14:21:36 crc kubenswrapper[4762]: I0217 14:21:36.078827 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" path="/var/lib/kubelet/pods/e6b5b326-ce22-4423-ae59-bcb2d6595dfd/volumes" Feb 17 14:21:40 crc kubenswrapper[4762]: I0217 14:21:40.230348 4762 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Feb 17 14:21:40 crc kubenswrapper[4762]: I0217 14:21:40.230751 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f7a72999-d771-4b3e-ba91-38078274aa35" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:21:50 crc kubenswrapper[4762]: I0217 14:21:50.230213 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.977300 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-bg5l7"] Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 
14:22:07.979535 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="extract-utilities" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.979686 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="extract-utilities" Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 14:22:07.979803 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="extract-utilities" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.979885 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="extract-utilities" Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 14:22:07.979987 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.980064 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 14:22:07.980150 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="extract-content" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.980227 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="extract-content" Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 14:22:07.980328 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.980400 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: E0217 14:22:07.980484 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="extract-content" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.980558 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="extract-content" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.981958 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b5b326-ce22-4423-ae59-bcb2d6595dfd" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.982083 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="040c9a6f-c6aa-4e11-9fdb-f578c55ab809" containerName="registry-server" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.982911 4762 util.go:30] "No sandbox for pod can be found. 
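The cpu_manager/state_mem/memory_manager burst above is resource-manager housekeeping: before admitting the new collector pod, the kubelet drops per-container CPU and memory state left over from pods that no longer exist (here, the two deleted catalog pods). A sketch of that cleanup over an assumed podUID -> container -> assignment map; the data structures are invented, only the cleanup-on-admission behavior is taken from the log:

package main

import "fmt"

type assignments map[string]map[string]string // podUID -> containerName -> cpuset

func removeStaleState(a assignments, activePods map[string]bool) {
	for podUID, containers := range a {
		if activePods[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
			fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n", podUID, name)
		}
		delete(a, podUID)
	}
}

func main() {
	a := assignments{
		"040c9a6f-c6aa-4e11-9fdb-f578c55ab809": {
			"extract-utilities": "", "extract-content": "", "registry-server": "",
		},
	}
	removeStaleState(a, map[string]bool{}) // both catalog pods are gone
}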
Need to start a new one" pod="openshift-logging/collector-bg5l7" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.985068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.985805 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.985934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-rs64k" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.986406 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.986613 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 17 14:22:07 crc kubenswrapper[4762]: I0217 14:22:07.993803 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.006655 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bg5l7"] Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.064578 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-bg5l7"] Feb 17 14:22:08 crc kubenswrapper[4762]: E0217 14:22:08.065412 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-vqpbc metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-bg5l7" podUID="b9c4a06b-738b-4be4-87c2-eef667f518ba" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.074394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.074756 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.074904 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075022 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075146 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075283 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075400 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqpbc\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075502 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075666 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075775 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.075903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.176999 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqpbc\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177066 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177106 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177128 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177149 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177271 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177391 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177457 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177480 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.177513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: E0217 14:22:08.178032 4762 secret.go:188] Couldn't 
get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Feb 17 14:22:08 crc kubenswrapper[4762]: E0217 14:22:08.178105 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics podName:b9c4a06b-738b-4be4-87c2-eef667f518ba nodeName:}" failed. No retries permitted until 2026-02-17 14:22:08.678087756 +0000 UTC m=+1009.258088408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics") pod "collector-bg5l7" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba") : secret "collector-metrics" not found Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.178478 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.179102 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.179109 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.180538 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.187065 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.187537 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.187898 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.202630 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " 
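The nestedpendingoperations record above shows how a failed mount is retried: the operation is parked and no retry is permitted until a deadline 500ms out (durationBeforeRetry 500ms); on a later pass the secret exists and SetUp succeeds. Repeated failures of the same operation back off exponentially; in the sketch below only the 500ms initial delay is read from the log, while the doubling factor and cap are assumptions:

package main

import (
	"fmt"
	"time"
)

func durationBeforeRetry(failures int) time.Duration {
	d := 500 * time.Millisecond // initial delay, as logged
	for i := 1; i < failures; i++ {
		d *= 2 // assumed factor
		if d > 2*time.Minute { // assumed cap
			return 2 * time.Minute
		}
	}
	return d
}

func main() {
	for f := 1; f <= 5; f++ {
		fmt.Printf("failure %d: durationBeforeRetry %v\n", f, durationBeforeRetry(f))
	}
}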
pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.203577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqpbc\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.684908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.689399 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") pod \"collector-bg5l7\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.895580 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.905055 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bg5l7" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.989900 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.989972 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990059 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990104 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990209 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqpbc\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990254 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt\") pod 
\"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990326 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990366 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990398 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990441 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990483 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token\") pod \"b9c4a06b-738b-4be4-87c2-eef667f518ba\" (UID: \"b9c4a06b-738b-4be4-87c2-eef667f518ba\") " Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990439 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.990559 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.991437 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir" (OuterVolumeSpecName: "datadir") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.992027 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config" (OuterVolumeSpecName: "config") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.992222 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.993498 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics" (OuterVolumeSpecName: "metrics") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.993740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp" (OuterVolumeSpecName: "tmp") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.994374 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token" (OuterVolumeSpecName: "collector-token") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:08 crc kubenswrapper[4762]: I0217 14:22:08.995755 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token" (OuterVolumeSpecName: "sa-token") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.002246 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.002892 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc" (OuterVolumeSpecName: "kube-api-access-vqpbc") pod "b9c4a06b-738b-4be4-87c2-eef667f518ba" (UID: "b9c4a06b-738b-4be4-87c2-eef667f518ba"). InnerVolumeSpecName "kube-api-access-vqpbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092455 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqpbc\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-kube-api-access-vqpbc\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092499 4762 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092510 4762 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9c4a06b-738b-4be4-87c2-eef667f518ba-tmp\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092522 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092533 4762 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092766 4762 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b9c4a06b-738b-4be4-87c2-eef667f518ba-datadir\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092780 4762 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092788 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092796 4762 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b9c4a06b-738b-4be4-87c2-eef667f518ba-entrypoint\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092804 4762 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b9c4a06b-738b-4be4-87c2-eef667f518ba-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.092813 4762 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b9c4a06b-738b-4be4-87c2-eef667f518ba-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.901395 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-bg5l7" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.959793 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-bg5l7"] Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.963727 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-bg5l7"] Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.968155 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-4jmff"] Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.969021 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-4jmff" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.971061 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.971283 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.971893 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-rs64k" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.976744 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 17 14:22:09 crc kubenswrapper[4762]: I0217 14:22:09.978184 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.002208 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-4jmff"] Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.002863 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.080611 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c4a06b-738b-4be4-87c2-eef667f518ba" path="/var/lib/kubelet/pods/b9c4a06b-738b-4be4-87c2-eef667f518ba/volumes" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110096 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a515723d-c024-422f-ae28-6e5b5daeea76-tmp\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110190 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config-openshift-service-cacrt\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-token\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110417 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-metrics\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110533 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6fj\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-kube-api-access-6c6fj\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110711 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-syslog-receiver\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.110763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/a515723d-c024-422f-ae28-6e5b5daeea76-datadir\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.111092 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-entrypoint\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.111152 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-sa-token\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.111198 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.111234 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-trusted-ca\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-entrypoint\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212386 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-sa-token\") pod \"collector-4jmff\" (UID: 
\"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212439 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212462 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-trusted-ca\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212524 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a515723d-c024-422f-ae28-6e5b5daeea76-tmp\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config-openshift-service-cacrt\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212583 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-token\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-metrics\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c6fj\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-kube-api-access-6c6fj\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-syslog-receiver\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/a515723d-c024-422f-ae28-6e5b5daeea76-datadir\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.212866 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"datadir\" (UniqueName: \"kubernetes.io/host-path/a515723d-c024-422f-ae28-6e5b5daeea76-datadir\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.213766 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.214872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-trusted-ca\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.214890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-config-openshift-service-cacrt\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.215312 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/a515723d-c024-422f-ae28-6e5b5daeea76-entrypoint\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.217776 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a515723d-c024-422f-ae28-6e5b5daeea76-tmp\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.217998 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-token\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.218833 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-metrics\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.219510 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/a515723d-c024-422f-ae28-6e5b5daeea76-collector-syslog-receiver\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.234021 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6fj\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-kube-api-access-6c6fj\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.234490 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/a515723d-c024-422f-ae28-6e5b5daeea76-sa-token\") pod \"collector-4jmff\" (UID: \"a515723d-c024-422f-ae28-6e5b5daeea76\") " pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.294909 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-4jmff" Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.698859 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-4jmff"] Feb 17 14:22:10 crc kubenswrapper[4762]: I0217 14:22:10.908096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-4jmff" event={"ID":"a515723d-c024-422f-ae28-6e5b5daeea76","Type":"ContainerStarted","Data":"b9d516ff69e7bea7b320321de45935510651fd568accb51352b781818e9990ac"} Feb 17 14:22:16 crc kubenswrapper[4762]: I0217 14:22:16.955917 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-4jmff" event={"ID":"a515723d-c024-422f-ae28-6e5b5daeea76","Type":"ContainerStarted","Data":"9587ac1e814fa063ffe56cc784446b1f4c01b9d0f3a395578e02fa2a1b89e631"} Feb 17 14:22:16 crc kubenswrapper[4762]: I0217 14:22:16.985632 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-4jmff" podStartSLOduration=2.13772505 podStartE2EDuration="7.985603387s" podCreationTimestamp="2026-02-17 14:22:09 +0000 UTC" firstStartedPulling="2026-02-17 14:22:10.707449635 +0000 UTC m=+1011.287450307" lastFinishedPulling="2026-02-17 14:22:16.555327982 +0000 UTC m=+1017.135328644" observedRunningTime="2026-02-17 14:22:16.97407722 +0000 UTC m=+1017.554077912" watchObservedRunningTime="2026-02-17 14:22:16.985603387 +0000 UTC m=+1017.565604049" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.488083 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m"] Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.490119 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.492228 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.497711 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m"] Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.596694 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2kv\" (UniqueName: \"kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.597146 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.597349 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.698694 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.698757 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.698891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2kv\" (UniqueName: \"kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.699230 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.699329 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.731639 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd2kv\" (UniqueName: \"kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:52 crc kubenswrapper[4762]: I0217 14:22:52.812513 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:53 crc kubenswrapper[4762]: I0217 14:22:53.346326 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m"] Feb 17 14:22:53 crc kubenswrapper[4762]: I0217 14:22:53.444474 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" event={"ID":"ce29a95a-c876-4e03-8b7c-89994be40488","Type":"ContainerStarted","Data":"05e20086d7819b3bfe2d5f6c375839357c7ddac897dff50121ec45a250604115"} Feb 17 14:22:54 crc kubenswrapper[4762]: I0217 14:22:54.453259 4762 generic.go:334] "Generic (PLEG): container finished" podID="ce29a95a-c876-4e03-8b7c-89994be40488" containerID="37f46f3dae8fdca1ec14932973a6e670b1d845951b2e521c3716412ffab28e66" exitCode=0 Feb 17 14:22:54 crc kubenswrapper[4762]: I0217 14:22:54.453331 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" event={"ID":"ce29a95a-c876-4e03-8b7c-89994be40488","Type":"ContainerDied","Data":"37f46f3dae8fdca1ec14932973a6e670b1d845951b2e521c3716412ffab28e66"} Feb 17 14:22:56 crc kubenswrapper[4762]: I0217 14:22:56.469109 4762 generic.go:334] "Generic (PLEG): container finished" podID="ce29a95a-c876-4e03-8b7c-89994be40488" containerID="227c63e74a9d0565d3452ef336e4f75412988f232a8f450a5011d16dbbe4473f" exitCode=0 Feb 17 14:22:56 crc kubenswrapper[4762]: I0217 14:22:56.470264 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" event={"ID":"ce29a95a-c876-4e03-8b7c-89994be40488","Type":"ContainerDied","Data":"227c63e74a9d0565d3452ef336e4f75412988f232a8f450a5011d16dbbe4473f"} Feb 17 14:22:57 crc kubenswrapper[4762]: I0217 14:22:57.478986 4762 generic.go:334] "Generic (PLEG): container finished" podID="ce29a95a-c876-4e03-8b7c-89994be40488" containerID="6e770ca94d521b2df13fec8bf18bda3dfae0bc0348349b18bdc730b9b9914d10" exitCode=0 Feb 17 14:22:57 crc kubenswrapper[4762]: I0217 
Feb 17 14:22:57 crc kubenswrapper[4762]: I0217 14:22:57.479309 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" event={"ID":"ce29a95a-c876-4e03-8b7c-89994be40488","Type":"ContainerDied","Data":"6e770ca94d521b2df13fec8bf18bda3dfae0bc0348349b18bdc730b9b9914d10"} Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.790317 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.902371 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd2kv\" (UniqueName: \"kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv\") pod \"ce29a95a-c876-4e03-8b7c-89994be40488\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.902609 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle\") pod \"ce29a95a-c876-4e03-8b7c-89994be40488\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.902717 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util\") pod \"ce29a95a-c876-4e03-8b7c-89994be40488\" (UID: \"ce29a95a-c876-4e03-8b7c-89994be40488\") " Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.903307 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle" (OuterVolumeSpecName: "bundle") pod "ce29a95a-c876-4e03-8b7c-89994be40488" (UID: "ce29a95a-c876-4e03-8b7c-89994be40488"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.909511 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv" (OuterVolumeSpecName: "kube-api-access-wd2kv") pod "ce29a95a-c876-4e03-8b7c-89994be40488" (UID: "ce29a95a-c876-4e03-8b7c-89994be40488"). InnerVolumeSpecName "kube-api-access-wd2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:58 crc kubenswrapper[4762]: I0217 14:22:58.946414 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util" (OuterVolumeSpecName: "util") pod "ce29a95a-c876-4e03-8b7c-89994be40488" (UID: "ce29a95a-c876-4e03-8b7c-89994be40488"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.004453 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.004497 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce29a95a-c876-4e03-8b7c-89994be40488-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.004509 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd2kv\" (UniqueName: \"kubernetes.io/projected/ce29a95a-c876-4e03-8b7c-89994be40488-kube-api-access-wd2kv\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.494454 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" event={"ID":"ce29a95a-c876-4e03-8b7c-89994be40488","Type":"ContainerDied","Data":"05e20086d7819b3bfe2d5f6c375839357c7ddac897dff50121ec45a250604115"} Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.494498 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e20086d7819b3bfe2d5f6c375839357c7ddac897dff50121ec45a250604115" Feb 17 14:22:59 crc kubenswrapper[4762]: I0217 14:22:59.494533 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.796766 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ctz7n"] Feb 17 14:23:01 crc kubenswrapper[4762]: E0217 14:23:01.797508 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="extract" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.797527 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="extract" Feb 17 14:23:01 crc kubenswrapper[4762]: E0217 14:23:01.797554 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="pull" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.797561 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="pull" Feb 17 14:23:01 crc kubenswrapper[4762]: E0217 14:23:01.797583 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="util" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.797590 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="util" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.797779 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce29a95a-c876-4e03-8b7c-89994be40488" containerName="extract" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.798443 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.800982 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-76c9q" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.801073 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.801278 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.823750 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ctz7n"] Feb 17 14:23:01 crc kubenswrapper[4762]: I0217 14:23:01.952214 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvfcv\" (UniqueName: \"kubernetes.io/projected/7b234a38-b4bf-43c7-b406-127d6df3b021-kube-api-access-pvfcv\") pod \"nmstate-operator-694c9596b7-ctz7n\" (UID: \"7b234a38-b4bf-43c7-b406-127d6df3b021\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" Feb 17 14:23:02 crc kubenswrapper[4762]: I0217 14:23:02.055720 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvfcv\" (UniqueName: \"kubernetes.io/projected/7b234a38-b4bf-43c7-b406-127d6df3b021-kube-api-access-pvfcv\") pod \"nmstate-operator-694c9596b7-ctz7n\" (UID: \"7b234a38-b4bf-43c7-b406-127d6df3b021\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" Feb 17 14:23:02 crc kubenswrapper[4762]: I0217 14:23:02.105832 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvfcv\" (UniqueName: \"kubernetes.io/projected/7b234a38-b4bf-43c7-b406-127d6df3b021-kube-api-access-pvfcv\") pod \"nmstate-operator-694c9596b7-ctz7n\" (UID: \"7b234a38-b4bf-43c7-b406-127d6df3b021\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" Feb 17 14:23:02 crc kubenswrapper[4762]: I0217 14:23:02.120380 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" Feb 17 14:23:02 crc kubenswrapper[4762]: I0217 14:23:02.559672 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-ctz7n"] Feb 17 14:23:02 crc kubenswrapper[4762]: I0217 14:23:02.566887 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:23:03 crc kubenswrapper[4762]: I0217 14:23:03.520569 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" event={"ID":"7b234a38-b4bf-43c7-b406-127d6df3b021","Type":"ContainerStarted","Data":"8595209a6cbc9bc93a2b49acf1273f6b4bc257d28fe991bb7e02f3b91942c4bd"} Feb 17 14:23:05 crc kubenswrapper[4762]: I0217 14:23:05.543218 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" event={"ID":"7b234a38-b4bf-43c7-b406-127d6df3b021","Type":"ContainerStarted","Data":"d0305f9a0456f09e81c94e5365040bc9908dab19b93e3fe60733922dd875073f"} Feb 17 14:23:05 crc kubenswrapper[4762]: I0217 14:23:05.564839 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-ctz7n" podStartSLOduration=2.06742955 podStartE2EDuration="4.564817258s" podCreationTimestamp="2026-02-17 14:23:01 +0000 UTC" firstStartedPulling="2026-02-17 14:23:02.566597941 +0000 UTC m=+1063.146598593" lastFinishedPulling="2026-02-17 14:23:05.063985649 +0000 UTC m=+1065.643986301" observedRunningTime="2026-02-17 14:23:05.561770784 +0000 UTC m=+1066.141771516" watchObservedRunningTime="2026-02-17 14:23:05.564817258 +0000 UTC m=+1066.144817920" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.581832 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.583179 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.585325 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z2mjf" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.595306 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.596487 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.598158 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.613375 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.623946 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.640948 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-chbj9"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.642246 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.732921 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1a3455d0-6909-41ab-9c83-f5a96c9858d1-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733036 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-dbus-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733081 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8tp\" (UniqueName: \"kubernetes.io/projected/1a3455d0-6909-41ab-9c83-f5a96c9858d1-kube-api-access-2g8tp\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733117 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-nmstate-lock\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733375 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btn7x\" (UniqueName: \"kubernetes.io/projected/384f1796-2d88-476c-be59-1abc8ee06efb-kube-api-access-btn7x\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733437 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqpc\" (UniqueName: \"kubernetes.io/projected/d8c030bf-f09b-4f2d-9db7-b167348f912f-kube-api-access-mvqpc\") pod \"nmstate-metrics-58c85c668d-pg2bv\" (UID: \"d8c030bf-f09b-4f2d-9db7-b167348f912f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.733490 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-ovs-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.774254 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.784995 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.794212 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.794783 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.795037 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qpwjx" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.796337 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm"] Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.834770 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btn7x\" (UniqueName: \"kubernetes.io/projected/384f1796-2d88-476c-be59-1abc8ee06efb-kube-api-access-btn7x\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835070 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqpc\" (UniqueName: \"kubernetes.io/projected/d8c030bf-f09b-4f2d-9db7-b167348f912f-kube-api-access-mvqpc\") pod \"nmstate-metrics-58c85c668d-pg2bv\" (UID: \"d8c030bf-f09b-4f2d-9db7-b167348f912f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835205 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-ovs-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835309 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1a3455d0-6909-41ab-9c83-f5a96c9858d1-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-dbus-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835533 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8tp\" (UniqueName: \"kubernetes.io/projected/1a3455d0-6909-41ab-9c83-f5a96c9858d1-kube-api-access-2g8tp\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.836798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-nmstate-lock\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 
17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.837437 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-dbus-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.835744 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-ovs-socket\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.837466 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-nmstate-lock\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.852368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1a3455d0-6909-41ab-9c83-f5a96c9858d1-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.859945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btn7x\" (UniqueName: \"kubernetes.io/projected/384f1796-2d88-476c-be59-1abc8ee06efb-kube-api-access-btn7x\") pod \"nmstate-handler-chbj9\" (UID: \"384f1796-2d88-476c-be59-1abc8ee06efb\") " pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.866349 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8tp\" (UniqueName: \"kubernetes.io/projected/1a3455d0-6909-41ab-9c83-f5a96c9858d1-kube-api-access-2g8tp\") pod \"nmstate-webhook-866bcb46dc-tlsn7\" (UID: \"1a3455d0-6909-41ab-9c83-f5a96c9858d1\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.875428 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqpc\" (UniqueName: \"kubernetes.io/projected/d8c030bf-f09b-4f2d-9db7-b167348f912f-kube-api-access-mvqpc\") pod \"nmstate-metrics-58c85c668d-pg2bv\" (UID: \"d8c030bf-f09b-4f2d-9db7-b167348f912f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.903010 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv"
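Every mount entry carries a UniqueName of the form kubernetes.io/<plugin>/<pod-uid>-<volume-name>; the host-path mounts above for nmstate-handler-chbj9 (dbus-socket, ovs-socket, nmstate-lock) make the pattern easy to see. A small helper that splits such names, assuming the 36-character UUID layout observed in this journal; this is a convention of the log output, not a documented API:

```go
package main

import (
	"fmt"
	"strings"
)

// splitUniqueName decomposes names like
// kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-ovs-socket
// into plugin, pod UID, and volume name.
func splitUniqueName(u string) (plugin, podUID, volume string, ok bool) {
	rest, found := strings.CutPrefix(u, "kubernetes.io/")
	if !found {
		return "", "", "", false
	}
	i := strings.Index(rest, "/")
	if i < 0 || len(rest) <= i+38 { // need "/", a 36-char UID, "-", and a name
		return "", "", "", false
	}
	return rest[:i], rest[i+1 : i+37], rest[i+38:], true
}

func main() {
	fmt.Println(splitUniqueName("kubernetes.io/host-path/384f1796-2d88-476c-be59-1abc8ee06efb-ovs-socket"))
	// host-path 384f1796-2d88-476c-be59-1abc8ee06efb ovs-socket true
}
```

Grouping entries by the extracted pod UID is a quick way to reconstruct a single pod's volume lifecycle out of an interleaved journal like this one.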
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.944975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/676a0670-76e5-4a67-8afc-9e69c1561f26-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.945028 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q829j\" (UniqueName: \"kubernetes.io/projected/676a0670-76e5-4a67-8afc-9e69c1561f26-kube-api-access-q829j\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.945065 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:06 crc kubenswrapper[4762]: I0217 14:23:06.971027 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.049288 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/676a0670-76e5-4a67-8afc-9e69c1561f26-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.049603 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q829j\" (UniqueName: \"kubernetes.io/projected/676a0670-76e5-4a67-8afc-9e69c1561f26-kube-api-access-q829j\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.049662 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: E0217 14:23:07.049853 4762 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 17 14:23:07 crc kubenswrapper[4762]: E0217 14:23:07.049923 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert podName:676a0670-76e5-4a67-8afc-9e69c1561f26 nodeName:}" failed. No retries permitted until 2026-02-17 14:23:07.549898899 +0000 UTC m=+1068.129899541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-mwkcm" (UID: "676a0670-76e5-4a67-8afc-9e69c1561f26") : secret "plugin-serving-cert" not found
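The failed secret mount above is a benign race: the console-plugin pod was created before its plugin-serving-cert secret existed, so nestedpendingoperations parks the mount and forbids retries for a growing interval. A sketch of that backoff schedule; the 500ms starting point comes straight from the log, while the doubling factor and the roughly 2m2s cap are assumptions based on the kubelet's exponentialbackoff package rather than guaranteed behavior:

```go
package main

import (
	"fmt"
	"time"
)

// Replay the durationBeforeRetry growth behind the
// nestedpendingoperations error above.
func main() {
	const maxBackoff = 2*time.Minute + 2*time.Second // assumed cap
	d := 500 * time.Millisecond                      // from the log
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, d)
		d *= 2
		if d > maxBackoff {
			d = maxBackoff
		}
	}
}
```

Once the secret appears, the next permitted retry succeeds and the pod proceeds; the other volumes of the same pod mount normally in the meantime, as the entries below show.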
\"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.254443 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.254476 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjf4\" (UniqueName: \"kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356361 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-trusted-ca-bundle\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-service-ca\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356448 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-oauth-config\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356464 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-console-config\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356520 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356570 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.356588 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wgjf4\" (UniqueName: \"kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.357565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-console-config\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.357744 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-trusted-ca-bundle\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.357561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-service-ca\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.358247 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.361922 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-oauth-config\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.362863 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.378671 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjf4\" (UniqueName: \"kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4\") pod \"console-77f76d465c-nhgvb\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.469287 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.552510 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7"] Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.559359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-chbj9" event={"ID":"384f1796-2d88-476c-be59-1abc8ee06efb","Type":"ContainerStarted","Data":"d3b35d2be1449fda0a201030ee043d78fd25b89ac573f2f6b36c5b57197eed77"} Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.559784 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: W0217 14:23:07.562925 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3455d0_6909_41ab_9c83_f5a96c9858d1.slice/crio-e0fdcc209972b64cc3b8bf923ebfd7d3ab5f0c7793404f75609290ae00b39655 WatchSource:0}: Error finding container e0fdcc209972b64cc3b8bf923ebfd7d3ab5f0c7793404f75609290ae00b39655: Status 404 returned error can't find the container with id e0fdcc209972b64cc3b8bf923ebfd7d3ab5f0c7793404f75609290ae00b39655 Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.563288 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/676a0670-76e5-4a67-8afc-9e69c1561f26-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-mwkcm\" (UID: \"676a0670-76e5-4a67-8afc-9e69c1561f26\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.647423 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv"] Feb 17 14:23:07 crc kubenswrapper[4762]: W0217 14:23:07.652797 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c030bf_f09b_4f2d_9db7_b167348f912f.slice/crio-48f1aa826140801a10fb2b7ddb4a5cf0b47d9138777d9c6b141754f6df508cef WatchSource:0}: Error finding container 48f1aa826140801a10fb2b7ddb4a5cf0b47d9138777d9c6b141754f6df508cef: Status 404 returned error can't find the container with id 48f1aa826140801a10fb2b7ddb4a5cf0b47d9138777d9c6b141754f6df508cef Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.715019 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.923078 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77f76d465c-nhgvb"] Feb 17 14:23:07 crc kubenswrapper[4762]: I0217 14:23:07.959221 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm"] Feb 17 14:23:07 crc kubenswrapper[4762]: W0217 14:23:07.966216 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676a0670_76e5_4a67_8afc_9e69c1561f26.slice/crio-402e149437d8cafea3de1f10f95422921e52c7b902a6423fae67bcbc2abdf724 WatchSource:0}: Error finding container 402e149437d8cafea3de1f10f95422921e52c7b902a6423fae67bcbc2abdf724: Status 404 returned error can't find the container with id 402e149437d8cafea3de1f10f95422921e52c7b902a6423fae67bcbc2abdf724 Feb 17 14:23:08 crc kubenswrapper[4762]: I0217 14:23:08.567091 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" event={"ID":"676a0670-76e5-4a67-8afc-9e69c1561f26","Type":"ContainerStarted","Data":"402e149437d8cafea3de1f10f95422921e52c7b902a6423fae67bcbc2abdf724"} Feb 17 14:23:08 crc kubenswrapper[4762]: I0217 14:23:08.568066 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" event={"ID":"1a3455d0-6909-41ab-9c83-f5a96c9858d1","Type":"ContainerStarted","Data":"e0fdcc209972b64cc3b8bf923ebfd7d3ab5f0c7793404f75609290ae00b39655"} Feb 17 14:23:08 crc kubenswrapper[4762]: I0217 14:23:08.569599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f76d465c-nhgvb" event={"ID":"5d85da49-7985-429f-b4ed-d81ab921b28a","Type":"ContainerStarted","Data":"c8fb48ad1878b5889f3ee2586929930c5c785db1918e85937bc99df92ef018b4"} Feb 17 14:23:08 crc kubenswrapper[4762]: I0217 14:23:08.569742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f76d465c-nhgvb" event={"ID":"5d85da49-7985-429f-b4ed-d81ab921b28a","Type":"ContainerStarted","Data":"6f079a9d76ae9386818de75c547d45d1d76615870bd301de638e01b7863c2120"} Feb 17 14:23:08 crc kubenswrapper[4762]: I0217 14:23:08.570794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" event={"ID":"d8c030bf-f09b-4f2d-9db7-b167348f912f","Type":"ContainerStarted","Data":"48f1aa826140801a10fb2b7ddb4a5cf0b47d9138777d9c6b141754f6df508cef"} Feb 17 14:23:10 crc kubenswrapper[4762]: I0217 14:23:10.102692 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77f76d465c-nhgvb" podStartSLOduration=3.102675104 podStartE2EDuration="3.102675104s" podCreationTimestamp="2026-02-17 14:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:23:08.591244984 +0000 UTC m=+1069.171245636" watchObservedRunningTime="2026-02-17 14:23:10.102675104 +0000 UTC m=+1070.682675756" Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.596712 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" event={"ID":"1a3455d0-6909-41ab-9c83-f5a96c9858d1","Type":"ContainerStarted","Data":"d5a2d11e372aadc2732355f7966f357d13d9fa792f8f997f5c019addbbea0c6e"} Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 
14:23:11.597141 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.599225 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-chbj9" event={"ID":"384f1796-2d88-476c-be59-1abc8ee06efb","Type":"ContainerStarted","Data":"0f2d92d8d0fd2fe272ff357831b0c69cf0606005c1e0249de6055c4a5863649c"} Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.599921 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.601556 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" event={"ID":"d8c030bf-f09b-4f2d-9db7-b167348f912f","Type":"ContainerStarted","Data":"a57d5611fa9cc7c2f14d7ba7fa199d32fc786bbc066ddd2ddbcf502eb661fe60"} Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.603112 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" event={"ID":"676a0670-76e5-4a67-8afc-9e69c1561f26","Type":"ContainerStarted","Data":"6a53c3f2dc7b513c1aaeaf8f2069f8086a9efc3c26048803eeeaccb71126d8bc"} Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.721859 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" podStartSLOduration=2.174585733 podStartE2EDuration="5.721838733s" podCreationTimestamp="2026-02-17 14:23:06 +0000 UTC" firstStartedPulling="2026-02-17 14:23:07.565137131 +0000 UTC m=+1068.145137783" lastFinishedPulling="2026-02-17 14:23:11.112390131 +0000 UTC m=+1071.692390783" observedRunningTime="2026-02-17 14:23:11.716796536 +0000 UTC m=+1072.296797188" watchObservedRunningTime="2026-02-17 14:23:11.721838733 +0000 UTC m=+1072.301839385" Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.769360 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-chbj9" podStartSLOduration=1.717632541 podStartE2EDuration="5.76933735s" podCreationTimestamp="2026-02-17 14:23:06 +0000 UTC" firstStartedPulling="2026-02-17 14:23:07.079861866 +0000 UTC m=+1067.659862518" lastFinishedPulling="2026-02-17 14:23:11.131566685 +0000 UTC m=+1071.711567327" observedRunningTime="2026-02-17 14:23:11.765951387 +0000 UTC m=+1072.345952039" watchObservedRunningTime="2026-02-17 14:23:11.76933735 +0000 UTC m=+1072.349338002" Feb 17 14:23:11 crc kubenswrapper[4762]: I0217 14:23:11.770850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-mwkcm" podStartSLOduration=2.6383837100000003 podStartE2EDuration="5.770841721s" podCreationTimestamp="2026-02-17 14:23:06 +0000 UTC" firstStartedPulling="2026-02-17 14:23:07.968273562 +0000 UTC m=+1068.548274224" lastFinishedPulling="2026-02-17 14:23:11.100731573 +0000 UTC m=+1071.680732235" observedRunningTime="2026-02-17 14:23:11.750032923 +0000 UTC m=+1072.330033575" watchObservedRunningTime="2026-02-17 14:23:11.770841721 +0000 UTC m=+1072.350842373" Feb 17 14:23:14 crc kubenswrapper[4762]: I0217 14:23:14.627985 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" event={"ID":"d8c030bf-f09b-4f2d-9db7-b167348f912f","Type":"ContainerStarted","Data":"0928b612c020e41b90a52b9a98a77fb78fa463fa2c9cad6c58ebf1c3960345ee"} Feb 17 14:23:14 
crc kubenswrapper[4762]: I0217 14:23:14.652698 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-pg2bv" podStartSLOduration=2.014672579 podStartE2EDuration="8.652679472s" podCreationTimestamp="2026-02-17 14:23:06 +0000 UTC" firstStartedPulling="2026-02-17 14:23:07.656131404 +0000 UTC m=+1068.236132056" lastFinishedPulling="2026-02-17 14:23:14.294138287 +0000 UTC m=+1074.874138949" observedRunningTime="2026-02-17 14:23:14.647662025 +0000 UTC m=+1075.227662687" watchObservedRunningTime="2026-02-17 14:23:14.652679472 +0000 UTC m=+1075.232680114" Feb 17 14:23:16 crc kubenswrapper[4762]: I0217 14:23:16.996813 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-chbj9" Feb 17 14:23:17 crc kubenswrapper[4762]: I0217 14:23:17.470310 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:17 crc kubenswrapper[4762]: I0217 14:23:17.470700 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:17 crc kubenswrapper[4762]: I0217 14:23:17.475141 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:17 crc kubenswrapper[4762]: I0217 14:23:17.657673 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:23:17 crc kubenswrapper[4762]: I0217 14:23:17.727162 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:23:24 crc kubenswrapper[4762]: I0217 14:23:24.639408 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:23:24 crc kubenswrapper[4762]: I0217 14:23:24.639900 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:23:26 crc kubenswrapper[4762]: I0217 14:23:26.930098 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tlsn7" Feb 17 14:23:42 crc kubenswrapper[4762]: I0217 14:23:42.791091 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-86c5f45bcb-954rj" podUID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" containerName="console" containerID="cri-o://e2a227c620335e07b393b55093cee34504975bfcf2184304a2b4a6d8f1adcc33" gracePeriod=15 Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.193814 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6"] Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.196962 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.199399 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.206859 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6"] Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.280106 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c5f45bcb-954rj_36ae5bb3-63ce-4c9e-a891-c83b6ff22576/console/0.log" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.280148 4762 generic.go:334] "Generic (PLEG): container finished" podID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" containerID="e2a227c620335e07b393b55093cee34504975bfcf2184304a2b4a6d8f1adcc33" exitCode=2 Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.280175 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c5f45bcb-954rj" event={"ID":"36ae5bb3-63ce-4c9e-a891-c83b6ff22576","Type":"ContainerDied","Data":"e2a227c620335e07b393b55093cee34504975bfcf2184304a2b4a6d8f1adcc33"} Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.311227 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.311274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxczt\" (UniqueName: \"kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.311361 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.413346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.413483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: 
\"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.413516 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxczt\" (UniqueName: \"kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.422216 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.424042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.443000 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxczt\" (UniqueName: \"kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.492446 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c5f45bcb-954rj_36ae5bb3-63ce-4c9e-a891-c83b6ff22576/console/0.log" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.492713 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.533334 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.615904 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.615972 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.615997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.616036 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.616056 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz6s2\" (UniqueName: \"kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.616073 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.616121 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config\") pod \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\" (UID: \"36ae5bb3-63ce-4c9e-a891-c83b6ff22576\") " Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.617331 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.617356 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.617337 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config" (OuterVolumeSpecName: "console-config") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.617896 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca" (OuterVolumeSpecName: "service-ca") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.621189 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.621218 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2" (OuterVolumeSpecName: "kube-api-access-bz6s2") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "kube-api-access-bz6s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.621209 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "36ae5bb3-63ce-4c9e-a891-c83b6ff22576" (UID: "36ae5bb3-63ce-4c9e-a891-c83b6ff22576"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718334 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718379 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718391 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718402 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718414 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz6s2\" (UniqueName: \"kubernetes.io/projected/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-kube-api-access-bz6s2\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718426 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.718437 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36ae5bb3-63ce-4c9e-a891-c83b6ff22576-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:43 crc kubenswrapper[4762]: I0217 14:23:43.992090 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6"] Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.287984 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c5f45bcb-954rj_36ae5bb3-63ce-4c9e-a891-c83b6ff22576/console/0.log" Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.288100 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c5f45bcb-954rj" Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.288693 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c5f45bcb-954rj" event={"ID":"36ae5bb3-63ce-4c9e-a891-c83b6ff22576","Type":"ContainerDied","Data":"69c080d6e7ce862c43827c5762e2241dbb82a1455b0e858be45d7c62cfe62c6b"} Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.288730 4762 scope.go:117] "RemoveContainer" containerID="e2a227c620335e07b393b55093cee34504975bfcf2184304a2b4a6d8f1adcc33" Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.290282 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerStarted","Data":"54d320d9cbfd34182dea532d776f2f433cd578d7c77d3f5d76604cddf91f4c41"} Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.290327 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerStarted","Data":"6e0ae4b36322fc687d16ffb4731086018be32dc3fb025c8c4839fa8eefd6c637"} Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.310870 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:23:44 crc kubenswrapper[4762]: I0217 14:23:44.316826 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86c5f45bcb-954rj"] Feb 17 14:23:44 crc kubenswrapper[4762]: E0217 14:23:44.660973 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf00bbd70_901c_4a63_a6b4_ca6a97f6df6f.slice/crio-54d320d9cbfd34182dea532d776f2f433cd578d7c77d3f5d76604cddf91f4c41.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:23:45 crc kubenswrapper[4762]: I0217 14:23:45.298952 4762 generic.go:334] "Generic (PLEG): container finished" podID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerID="54d320d9cbfd34182dea532d776f2f433cd578d7c77d3f5d76604cddf91f4c41" exitCode=0 Feb 17 14:23:45 crc kubenswrapper[4762]: I0217 14:23:45.299003 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerDied","Data":"54d320d9cbfd34182dea532d776f2f433cd578d7c77d3f5d76604cddf91f4c41"} Feb 17 14:23:46 crc kubenswrapper[4762]: I0217 14:23:46.079634 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" path="/var/lib/kubelet/pods/36ae5bb3-63ce-4c9e-a891-c83b6ff22576/volumes" Feb 17 14:23:47 crc kubenswrapper[4762]: I0217 14:23:47.313936 4762 generic.go:334] "Generic (PLEG): container finished" podID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerID="ca1c22a17e06f3fc39af85a346a5bea81176eef8287104a2b8c5d5a61e603bf6" exitCode=0 Feb 17 14:23:47 crc kubenswrapper[4762]: I0217 14:23:47.314035 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerDied","Data":"ca1c22a17e06f3fc39af85a346a5bea81176eef8287104a2b8c5d5a61e603bf6"} Feb 17 14:23:48 crc 
kubenswrapper[4762]: I0217 14:23:48.323904 4762 generic.go:334] "Generic (PLEG): container finished" podID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerID="44e837ec034b262157a0d5e6b2350f980e76b0a1be4ca557f1d90fdfe5349d98" exitCode=0 Feb 17 14:23:48 crc kubenswrapper[4762]: I0217 14:23:48.324268 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerDied","Data":"44e837ec034b262157a0d5e6b2350f980e76b0a1be4ca557f1d90fdfe5349d98"} Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.644094 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.704308 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle\") pod \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.704464 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxczt\" (UniqueName: \"kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt\") pod \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.704507 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util\") pod \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\" (UID: \"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f\") " Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.705985 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle" (OuterVolumeSpecName: "bundle") pod "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" (UID: "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.713287 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt" (OuterVolumeSpecName: "kube-api-access-sxczt") pod "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" (UID: "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f"). InnerVolumeSpecName "kube-api-access-sxczt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.717601 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util" (OuterVolumeSpecName: "util") pod "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" (UID: "f00bbd70-901c-4a63-a6b4-ca6a97f6df6f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.806382 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.806425 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxczt\" (UniqueName: \"kubernetes.io/projected/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-kube-api-access-sxczt\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:49 crc kubenswrapper[4762]: I0217 14:23:49.806439 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f00bbd70-901c-4a63-a6b4-ca6a97f6df6f-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:50 crc kubenswrapper[4762]: I0217 14:23:50.339398 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" Feb 17 14:23:50 crc kubenswrapper[4762]: I0217 14:23:50.339280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6" event={"ID":"f00bbd70-901c-4a63-a6b4-ca6a97f6df6f","Type":"ContainerDied","Data":"6e0ae4b36322fc687d16ffb4731086018be32dc3fb025c8c4839fa8eefd6c637"} Feb 17 14:23:50 crc kubenswrapper[4762]: I0217 14:23:50.339461 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e0ae4b36322fc687d16ffb4731086018be32dc3fb025c8c4839fa8eefd6c637" Feb 17 14:23:54 crc kubenswrapper[4762]: I0217 14:23:54.621130 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:23:54 crc kubenswrapper[4762]: I0217 14:23:54.621700 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.883616 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5"] Feb 17 14:23:57 crc kubenswrapper[4762]: E0217 14:23:57.884235 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="pull" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.884248 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="pull" Feb 17 14:23:57 crc kubenswrapper[4762]: E0217 14:23:57.884269 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="extract" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.884276 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="extract" Feb 17 14:23:57 crc kubenswrapper[4762]: E0217 14:23:57.884293 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="util" Feb 17 14:23:57 
crc kubenswrapper[4762]: I0217 14:23:57.884301 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="util" Feb 17 14:23:57 crc kubenswrapper[4762]: E0217 14:23:57.884320 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" containerName="console" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.884328 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" containerName="console" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.884456 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00bbd70-901c-4a63-a6b4-ca6a97f6df6f" containerName="extract" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.884467 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ae5bb3-63ce-4c9e-a891-c83b6ff22576" containerName="console" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.885033 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.887044 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.887287 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.887310 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.887561 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hlknz" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.888111 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.905023 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5"] Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.949241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6db\" (UniqueName: \"kubernetes.io/projected/ecb19ca9-7000-48bf-b390-37343271ee18-kube-api-access-9f6db\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.949285 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-apiservice-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:57 crc kubenswrapper[4762]: I0217 14:23:57.949368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-webhook-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: 
\"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.050512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6db\" (UniqueName: \"kubernetes.io/projected/ecb19ca9-7000-48bf-b390-37343271ee18-kube-api-access-9f6db\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.050568 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-apiservice-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.050635 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-webhook-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.058597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-apiservice-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.058854 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb19ca9-7000-48bf-b390-37343271ee18-webhook-cert\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.077846 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6db\" (UniqueName: \"kubernetes.io/projected/ecb19ca9-7000-48bf-b390-37343271ee18-kube-api-access-9f6db\") pod \"metallb-operator-controller-manager-55bbdb8f74-wdnm5\" (UID: \"ecb19ca9-7000-48bf-b390-37343271ee18\") " pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.156416 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796"] Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.157395 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.159237 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qksck" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.160343 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.160690 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.172706 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796"] Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.200256 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.253308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-webhook-cert\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.253408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfx4p\" (UniqueName: \"kubernetes.io/projected/3838870d-4c8c-4055-a512-454c8d7bf205-kube-api-access-cfx4p\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.253457 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.355496 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfx4p\" (UniqueName: \"kubernetes.io/projected/3838870d-4c8c-4055-a512-454c8d7bf205-kube-api-access-cfx4p\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.355593 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.355741 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-webhook-cert\") pod 
\"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.360336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-webhook-cert\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.363287 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3838870d-4c8c-4055-a512-454c8d7bf205-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.384323 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfx4p\" (UniqueName: \"kubernetes.io/projected/3838870d-4c8c-4055-a512-454c8d7bf205-kube-api-access-cfx4p\") pod \"metallb-operator-webhook-server-6cf86c5464-wt796\" (UID: \"3838870d-4c8c-4055-a512-454c8d7bf205\") " pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:58 crc kubenswrapper[4762]: I0217 14:23:58.474772 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:23:59 crc kubenswrapper[4762]: I0217 14:23:59.238201 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5"] Feb 17 14:23:59 crc kubenswrapper[4762]: I0217 14:23:59.342957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796"] Feb 17 14:23:59 crc kubenswrapper[4762]: I0217 14:23:59.414312 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" event={"ID":"3838870d-4c8c-4055-a512-454c8d7bf205","Type":"ContainerStarted","Data":"2f3fb1df814502c7d8f0079814ad4b9781446f8835ef68467f4b3e93636c7659"} Feb 17 14:23:59 crc kubenswrapper[4762]: I0217 14:23:59.415829 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" event={"ID":"ecb19ca9-7000-48bf-b390-37343271ee18","Type":"ContainerStarted","Data":"ea72f8e352876454f46b8881b3029f3c03d9fb958465db08edd99853902f37c4"} Feb 17 14:24:07 crc kubenswrapper[4762]: E0217 14:24:07.075850 4762 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.005s" Feb 17 14:24:09 crc kubenswrapper[4762]: I0217 14:24:09.137922 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" event={"ID":"ecb19ca9-7000-48bf-b390-37343271ee18","Type":"ContainerStarted","Data":"e65b0eccd4f840272394f20b4b0a0cdd1d5f6a1ea39131bf44357066d29bcdb8"} Feb 17 14:24:09 crc kubenswrapper[4762]: I0217 14:24:09.139585 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:24:09 crc kubenswrapper[4762]: I0217 14:24:09.141436 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" event={"ID":"3838870d-4c8c-4055-a512-454c8d7bf205","Type":"ContainerStarted","Data":"ec94808a6b0fed4c5b5b91e2b674a70835a0091b25814a3ec6bebc2b882e1b8d"} Feb 17 14:24:09 crc kubenswrapper[4762]: I0217 14:24:09.141974 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:24:09 crc kubenswrapper[4762]: I0217 14:24:09.412352 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" podStartSLOduration=2.884605003 podStartE2EDuration="12.41232823s" podCreationTimestamp="2026-02-17 14:23:57 +0000 UTC" firstStartedPulling="2026-02-17 14:23:59.251974617 +0000 UTC m=+1119.831975269" lastFinishedPulling="2026-02-17 14:24:08.779697844 +0000 UTC m=+1129.359698496" observedRunningTime="2026-02-17 14:24:09.40537687 +0000 UTC m=+1129.985377532" watchObservedRunningTime="2026-02-17 14:24:09.41232823 +0000 UTC m=+1129.992328882" Feb 17 14:24:18 crc kubenswrapper[4762]: I0217 14:24:18.580016 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" Feb 17 14:24:18 crc kubenswrapper[4762]: I0217 14:24:18.647600 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cf86c5464-wt796" podStartSLOduration=11.205525158 podStartE2EDuration="20.647583537s" podCreationTimestamp="2026-02-17 14:23:58 +0000 UTC" firstStartedPulling="2026-02-17 14:23:59.349992892 +0000 UTC m=+1119.929993544" lastFinishedPulling="2026-02-17 14:24:08.792051271 +0000 UTC m=+1129.372051923" observedRunningTime="2026-02-17 14:24:09.452688321 +0000 UTC m=+1130.032688973" watchObservedRunningTime="2026-02-17 14:24:18.647583537 +0000 UTC m=+1139.227584189" Feb 17 14:24:24 crc kubenswrapper[4762]: I0217 14:24:24.621907 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:24:24 crc kubenswrapper[4762]: I0217 14:24:24.622435 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:24:24 crc kubenswrapper[4762]: I0217 14:24:24.622480 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:24:24 crc kubenswrapper[4762]: I0217 14:24:24.623119 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:24:24 crc kubenswrapper[4762]: I0217 14:24:24.623172 4762 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9" gracePeriod=600 Feb 17 14:24:25 crc kubenswrapper[4762]: I0217 14:24:25.091675 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9" exitCode=0 Feb 17 14:24:25 crc kubenswrapper[4762]: I0217 14:24:25.091722 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9"} Feb 17 14:24:25 crc kubenswrapper[4762]: I0217 14:24:25.092051 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44"} Feb 17 14:24:25 crc kubenswrapper[4762]: I0217 14:24:25.092077 4762 scope.go:117] "RemoveContainer" containerID="a30a93d238cea1f8adefd72afd175112649379fa52475b885f21fda62dbe2cba" Feb 17 14:24:38 crc kubenswrapper[4762]: I0217 14:24:38.204454 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-55bbdb8f74-wdnm5" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.128616 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.129949 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.132337 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tqb29" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.133325 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.136812 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.145610 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kmqrr"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.150660 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.154008 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.154785 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.243344 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w6fdr"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.244859 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.248679 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.249120 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z5ngf" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.249371 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.249478 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.254348 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-fblcw"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.256264 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.258763 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.272550 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-fblcw"] Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8ff3f905-182a-4670-9789-efea7744fa7a-frr-startup\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275165 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztt8w\" (UniqueName: \"kubernetes.io/projected/eb14da33-81db-4b59-8325-af90620744fe-kube-api-access-ztt8w\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275214 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-reloader\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275236 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5d8m\" (UniqueName: \"kubernetes.io/projected/8ff3f905-182a-4670-9789-efea7744fa7a-kube-api-access-x5d8m\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275283 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff3f905-182a-4670-9789-efea7744fa7a-metrics-certs\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275304 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-conf\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275337 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-sockets\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.275358 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-metrics\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.376625 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-cert\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377276 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrkv\" (UniqueName: \"kubernetes.io/projected/e37a158f-5b24-474c-9405-fc86bef30818-kube-api-access-gqrkv\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377311 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377363 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff3f905-182a-4670-9789-efea7744fa7a-metrics-certs\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-conf\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377440 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-sockets\") 
pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377463 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-metrics\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377912 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grtvx\" (UniqueName: \"kubernetes.io/projected/89cf356f-3fde-40db-9749-8f0bd5f61407-kube-api-access-grtvx\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.377978 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8ff3f905-182a-4670-9789-efea7744fa7a-frr-startup\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378036 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-metrics-certs\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378081 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztt8w\" (UniqueName: \"kubernetes.io/projected/eb14da33-81db-4b59-8325-af90620744fe-kube-api-access-ztt8w\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378179 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-conf\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-reloader\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: 
I0217 14:24:39.378529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89cf356f-3fde-40db-9749-8f0bd5f61407-metallb-excludel2\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.378311 4762 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378573 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5d8m\" (UniqueName: \"kubernetes.io/projected/8ff3f905-182a-4670-9789-efea7744fa7a-kube-api-access-x5d8m\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.378611 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert podName:eb14da33-81db-4b59-8325-af90620744fe nodeName:}" failed. No retries permitted until 2026-02-17 14:24:39.878593173 +0000 UTC m=+1160.458593825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert") pod "frr-k8s-webhook-server-78b44bf5bb-bd9n7" (UID: "eb14da33-81db-4b59-8325-af90620744fe") : secret "frr-k8s-webhook-server-cert" not found Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378613 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-metrics\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.378471 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-reloader\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.379174 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8ff3f905-182a-4670-9789-efea7744fa7a-frr-startup\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.379230 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8ff3f905-182a-4670-9789-efea7744fa7a-frr-sockets\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.397291 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff3f905-182a-4670-9789-efea7744fa7a-metrics-certs\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.400483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5d8m\" (UniqueName: 
\"kubernetes.io/projected/8ff3f905-182a-4670-9789-efea7744fa7a-kube-api-access-x5d8m\") pod \"frr-k8s-kmqrr\" (UID: \"8ff3f905-182a-4670-9789-efea7744fa7a\") " pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.402551 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztt8w\" (UniqueName: \"kubernetes.io/projected/eb14da33-81db-4b59-8325-af90620744fe-kube-api-access-ztt8w\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.470268 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.510412 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-metrics-certs\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511055 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89cf356f-3fde-40db-9749-8f0bd5f61407-metallb-excludel2\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-cert\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrkv\" (UniqueName: \"kubernetes.io/projected/e37a158f-5b24-474c-9405-fc86bef30818-kube-api-access-gqrkv\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511437 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grtvx\" (UniqueName: \"kubernetes.io/projected/89cf356f-3fde-40db-9749-8f0bd5f61407-kube-api-access-grtvx\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.511962 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89cf356f-3fde-40db-9749-8f0bd5f61407-metallb-excludel2\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.512040 4762 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.512264 4762 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.512414 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist podName:89cf356f-3fde-40db-9749-8f0bd5f61407 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:40.012395775 +0000 UTC m=+1160.592396427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist") pod "speaker-w6fdr" (UID: "89cf356f-3fde-40db-9749-8f0bd5f61407") : secret "metallb-memberlist" not found Feb 17 14:24:39 crc kubenswrapper[4762]: E0217 14:24:39.513315 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs podName:89cf356f-3fde-40db-9749-8f0bd5f61407 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:40.01330179 +0000 UTC m=+1160.593302442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs") pod "speaker-w6fdr" (UID: "89cf356f-3fde-40db-9749-8f0bd5f61407") : secret "speaker-certs-secret" not found Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.516257 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.523907 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-metrics-certs\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.524986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e37a158f-5b24-474c-9405-fc86bef30818-cert\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.539464 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grtvx\" (UniqueName: \"kubernetes.io/projected/89cf356f-3fde-40db-9749-8f0bd5f61407-kube-api-access-grtvx\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.551770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrkv\" (UniqueName: \"kubernetes.io/projected/e37a158f-5b24-474c-9405-fc86bef30818-kube-api-access-gqrkv\") pod \"controller-69bbfbf88f-fblcw\" (UID: \"e37a158f-5b24-474c-9405-fc86bef30818\") " pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: 
I0217 14:24:39.603062 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.919295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:39 crc kubenswrapper[4762]: I0217 14:24:39.923849 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb14da33-81db-4b59-8325-af90620744fe-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bd9n7\" (UID: \"eb14da33-81db-4b59-8325-af90620744fe\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.020549 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:40 crc kubenswrapper[4762]: E0217 14:24:40.020747 4762 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.020755 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:40 crc kubenswrapper[4762]: E0217 14:24:40.020813 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist podName:89cf356f-3fde-40db-9749-8f0bd5f61407 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:41.02079432 +0000 UTC m=+1161.600794972 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist") pod "speaker-w6fdr" (UID: "89cf356f-3fde-40db-9749-8f0bd5f61407") : secret "metallb-memberlist" not found Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.023830 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-metrics-certs\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.048485 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.090282 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-fblcw"] Feb 17 14:24:40 crc kubenswrapper[4762]: W0217 14:24:40.094181 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37a158f_5b24_474c_9405_fc86bef30818.slice/crio-6750ae2d90bd8240e40aca77aacba88d214790aed8acc4888c71612a54a3c30a WatchSource:0}: Error finding container 6750ae2d90bd8240e40aca77aacba88d214790aed8acc4888c71612a54a3c30a: Status 404 returned error can't find the container with id 6750ae2d90bd8240e40aca77aacba88d214790aed8acc4888c71612a54a3c30a Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.230778 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"d23f158686ecacc4091316f468f94afbdddbb253d32f324eaf29a10275987159"} Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.232028 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fblcw" event={"ID":"e37a158f-5b24-474c-9405-fc86bef30818","Type":"ContainerStarted","Data":"6750ae2d90bd8240e40aca77aacba88d214790aed8acc4888c71612a54a3c30a"} Feb 17 14:24:40 crc kubenswrapper[4762]: I0217 14:24:40.521420 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7"] Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.044814 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.051035 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89cf356f-3fde-40db-9749-8f0bd5f61407-memberlist\") pod \"speaker-w6fdr\" (UID: \"89cf356f-3fde-40db-9749-8f0bd5f61407\") " pod="metallb-system/speaker-w6fdr" Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.064918 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w6fdr" Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.240680 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6fdr" event={"ID":"89cf356f-3fde-40db-9749-8f0bd5f61407","Type":"ContainerStarted","Data":"64b6d965bd2ae4abedb797b17618a9a5c698a9d4902f0e15a21460e3ab6a010e"} Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.242112 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" event={"ID":"eb14da33-81db-4b59-8325-af90620744fe","Type":"ContainerStarted","Data":"623a12369f0b0e5843be51ba65409112e292c902333ef876aea33a7d61229df0"} Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.243534 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fblcw" event={"ID":"e37a158f-5b24-474c-9405-fc86bef30818","Type":"ContainerStarted","Data":"cb20590fecc88df8fb68f2222cb72d454bf0ab0edd75e38f18415dfc55e749e1"} Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.243574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fblcw" event={"ID":"e37a158f-5b24-474c-9405-fc86bef30818","Type":"ContainerStarted","Data":"714f3e78d34db6a9b1ff8a1196da4592ca67419825716e0e2fd9d10b99eca5da"} Feb 17 14:24:41 crc kubenswrapper[4762]: I0217 14:24:41.244009 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:42 crc kubenswrapper[4762]: I0217 14:24:42.266907 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6fdr" event={"ID":"89cf356f-3fde-40db-9749-8f0bd5f61407","Type":"ContainerStarted","Data":"7e903a198047d7d9d0ce1539d4c9348c138a76fafc912eed0aa4513215084fd4"} Feb 17 14:24:42 crc kubenswrapper[4762]: I0217 14:24:42.267303 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w6fdr" event={"ID":"89cf356f-3fde-40db-9749-8f0bd5f61407","Type":"ContainerStarted","Data":"49dc559be7b3198b2e6873434fd7955aed559a8d358f8b0e3b7bebd1912f324c"} Feb 17 14:24:42 crc kubenswrapper[4762]: I0217 14:24:42.267672 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w6fdr" Feb 17 14:24:42 crc kubenswrapper[4762]: I0217 14:24:42.322526 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-fblcw" podStartSLOduration=3.322501737 podStartE2EDuration="3.322501737s" podCreationTimestamp="2026-02-17 14:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:24:41.262231151 +0000 UTC m=+1161.842231823" watchObservedRunningTime="2026-02-17 14:24:42.322501737 +0000 UTC m=+1162.902502389" Feb 17 14:24:42 crc kubenswrapper[4762]: I0217 14:24:42.329010 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w6fdr" podStartSLOduration=3.328982354 podStartE2EDuration="3.328982354s" podCreationTimestamp="2026-02-17 14:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:24:42.316332729 +0000 UTC m=+1162.896333381" watchObservedRunningTime="2026-02-17 14:24:42.328982354 +0000 UTC m=+1162.908983006" Feb 17 14:24:51 crc kubenswrapper[4762]: I0217 14:24:51.069779 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/speaker-w6fdr" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.372052 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.376195 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.386050 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.386564 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j445z" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.386618 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.386732 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.528075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pp9b\" (UniqueName: \"kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b\") pod \"openstack-operator-index-ts8xq\" (UID: \"18c06222-8721-4d45-aeb5-ea93bab1ea85\") " pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.630417 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pp9b\" (UniqueName: \"kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b\") pod \"openstack-operator-index-ts8xq\" (UID: \"18c06222-8721-4d45-aeb5-ea93bab1ea85\") " pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.653698 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pp9b\" (UniqueName: \"kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b\") pod \"openstack-operator-index-ts8xq\" (UID: \"18c06222-8721-4d45-aeb5-ea93bab1ea85\") " pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:24:55 crc kubenswrapper[4762]: I0217 14:24:55.705704 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.436805 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.470864 4762 generic.go:334] "Generic (PLEG): container finished" podID="8ff3f905-182a-4670-9789-efea7744fa7a" containerID="70a91b6b3b18e643c4bf568641deeb0f4168b0b84d1d699bb022cf68496f2277" exitCode=0 Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.471244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerDied","Data":"70a91b6b3b18e643c4bf568641deeb0f4168b0b84d1d699bb022cf68496f2277"} Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.473742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" event={"ID":"eb14da33-81db-4b59-8325-af90620744fe","Type":"ContainerStarted","Data":"b0442dbd398742ab8f1efb541e400778ada573f0fee78d4a62042dd9b73a9fed"} Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.473870 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:24:56 crc kubenswrapper[4762]: I0217 14:24:56.474879 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts8xq" event={"ID":"18c06222-8721-4d45-aeb5-ea93bab1ea85","Type":"ContainerStarted","Data":"6052eac3763b6a5c4f83c06cfdfdaf9a92b6a96a424c26d98635e58121bd3f8e"} Feb 17 14:24:57 crc kubenswrapper[4762]: I0217 14:24:57.484446 4762 generic.go:334] "Generic (PLEG): container finished" podID="8ff3f905-182a-4670-9789-efea7744fa7a" containerID="93d8f5abced17f6c163ec3516770a3bb3c005c3ad65be43feefec01a58c06892" exitCode=0 Feb 17 14:24:57 crc kubenswrapper[4762]: I0217 14:24:57.484500 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerDied","Data":"93d8f5abced17f6c163ec3516770a3bb3c005c3ad65be43feefec01a58c06892"} Feb 17 14:24:57 crc kubenswrapper[4762]: I0217 14:24:57.511935 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" podStartSLOduration=2.963520951 podStartE2EDuration="18.511908634s" podCreationTimestamp="2026-02-17 14:24:39 +0000 UTC" firstStartedPulling="2026-02-17 14:24:40.517158987 +0000 UTC m=+1161.097159639" lastFinishedPulling="2026-02-17 14:24:56.06554667 +0000 UTC m=+1176.645547322" observedRunningTime="2026-02-17 14:24:56.50998782 +0000 UTC m=+1177.089988472" watchObservedRunningTime="2026-02-17 14:24:57.511908634 +0000 UTC m=+1178.091909286" Feb 17 14:24:58 crc kubenswrapper[4762]: I0217 14:24:58.739545 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.353841 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sh6w6"] Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.354954 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.381090 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sh6w6"] Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.499448 4762 generic.go:334] "Generic (PLEG): container finished" podID="8ff3f905-182a-4670-9789-efea7744fa7a" containerID="1b99e9f5e4f28c7e7ab61b12118ffa049db19ca48fd484ac2148cb8d1d697c4f" exitCode=0 Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.499491 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerDied","Data":"1b99e9f5e4f28c7e7ab61b12118ffa049db19ca48fd484ac2148cb8d1d697c4f"} Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.500808 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts8xq" event={"ID":"18c06222-8721-4d45-aeb5-ea93bab1ea85","Type":"ContainerStarted","Data":"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64"} Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.500908 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ts8xq" podUID="18c06222-8721-4d45-aeb5-ea93bab1ea85" containerName="registry-server" containerID="cri-o://ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64" gracePeriod=2 Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.517665 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4l28\" (UniqueName: \"kubernetes.io/projected/f96d5046-7e85-41d7-b333-a5d22ef1e541-kube-api-access-n4l28\") pod \"openstack-operator-index-sh6w6\" (UID: \"f96d5046-7e85-41d7-b333-a5d22ef1e541\") " pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.544261 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ts8xq" podStartSLOduration=1.760651961 podStartE2EDuration="4.54423885s" podCreationTimestamp="2026-02-17 14:24:55 +0000 UTC" firstStartedPulling="2026-02-17 14:24:56.448033279 +0000 UTC m=+1177.028033931" lastFinishedPulling="2026-02-17 14:24:59.231620168 +0000 UTC m=+1179.811620820" observedRunningTime="2026-02-17 14:24:59.541491085 +0000 UTC m=+1180.121491737" watchObservedRunningTime="2026-02-17 14:24:59.54423885 +0000 UTC m=+1180.124239502" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.609249 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-fblcw" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.620526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4l28\" (UniqueName: \"kubernetes.io/projected/f96d5046-7e85-41d7-b333-a5d22ef1e541-kube-api-access-n4l28\") pod \"openstack-operator-index-sh6w6\" (UID: \"f96d5046-7e85-41d7-b333-a5d22ef1e541\") " pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.641112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4l28\" (UniqueName: \"kubernetes.io/projected/f96d5046-7e85-41d7-b333-a5d22ef1e541-kube-api-access-n4l28\") pod \"openstack-operator-index-sh6w6\" (UID: \"f96d5046-7e85-41d7-b333-a5d22ef1e541\") " 
pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:24:59 crc kubenswrapper[4762]: I0217 14:24:59.682208 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.438417 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.524963 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sh6w6"] Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.526096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"f98a36e77f4d2588837dec20a743ddeb03454aa8edb9e6a89380e72de71fd381"} Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.526465 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"0923fd6e674c09131acffc3f2d6e70f8e3344e774b8019201c1bb68a2f510591"} Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.528251 4762 generic.go:334] "Generic (PLEG): container finished" podID="18c06222-8721-4d45-aeb5-ea93bab1ea85" containerID="ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64" exitCode=0 Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.528331 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ts8xq" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.528380 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts8xq" event={"ID":"18c06222-8721-4d45-aeb5-ea93bab1ea85","Type":"ContainerDied","Data":"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64"} Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.528420 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ts8xq" event={"ID":"18c06222-8721-4d45-aeb5-ea93bab1ea85","Type":"ContainerDied","Data":"6052eac3763b6a5c4f83c06cfdfdaf9a92b6a96a424c26d98635e58121bd3f8e"} Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.528438 4762 scope.go:117] "RemoveContainer" containerID="ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64" Feb 17 14:25:00 crc kubenswrapper[4762]: W0217 14:25:00.533009 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96d5046_7e85_41d7_b333_a5d22ef1e541.slice/crio-96bd6f6c9e107a50c2e4c07f75b5c472e606ca40079a00c0311893223b520de7 WatchSource:0}: Error finding container 96bd6f6c9e107a50c2e4c07f75b5c472e606ca40079a00c0311893223b520de7: Status 404 returned error can't find the container with id 96bd6f6c9e107a50c2e4c07f75b5c472e606ca40079a00c0311893223b520de7 Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.565708 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pp9b\" (UniqueName: \"kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b\") pod \"18c06222-8721-4d45-aeb5-ea93bab1ea85\" (UID: \"18c06222-8721-4d45-aeb5-ea93bab1ea85\") " Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.571301 4762 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b" (OuterVolumeSpecName: "kube-api-access-8pp9b") pod "18c06222-8721-4d45-aeb5-ea93bab1ea85" (UID: "18c06222-8721-4d45-aeb5-ea93bab1ea85"). InnerVolumeSpecName "kube-api-access-8pp9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.584883 4762 scope.go:117] "RemoveContainer" containerID="ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64" Feb 17 14:25:00 crc kubenswrapper[4762]: E0217 14:25:00.586383 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64\": container with ID starting with ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64 not found: ID does not exist" containerID="ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.586417 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64"} err="failed to get container status \"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64\": rpc error: code = NotFound desc = could not find container \"ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64\": container with ID starting with ef2714c84b9cd31f86bce2564164a21a289bb2d47eeb9f88590b83a9051bbb64 not found: ID does not exist" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.667458 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pp9b\" (UniqueName: \"kubernetes.io/projected/18c06222-8721-4d45-aeb5-ea93bab1ea85-kube-api-access-8pp9b\") on node \"crc\" DevicePath \"\"" Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.862563 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:25:00 crc kubenswrapper[4762]: I0217 14:25:00.868834 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ts8xq"] Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.541181 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sh6w6" event={"ID":"f96d5046-7e85-41d7-b333-a5d22ef1e541","Type":"ContainerStarted","Data":"ded4fa84f70f6cbb206987ff63e15ae7006deca939daefc8be73a51a53a3159d"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.541253 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sh6w6" event={"ID":"f96d5046-7e85-41d7-b333-a5d22ef1e541","Type":"ContainerStarted","Data":"96bd6f6c9e107a50c2e4c07f75b5c472e606ca40079a00c0311893223b520de7"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.552374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"df636eac55e82eeeff5912b4177597ca702604c85c05f46de3a8465568f345f7"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.552424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"361ea75b522d291eccef52941fa39d3d0d507f50169de5f7e84075b5bc369228"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.552438 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"7d952cb82cda28c73bb9725e18fae67eff57f9fa6f9a3b0616efb3246cac4ab9"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.552449 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kmqrr" event={"ID":"8ff3f905-182a-4670-9789-efea7744fa7a","Type":"ContainerStarted","Data":"52fcc4087ba9aff264db29c7245eaf0213189ee7fbf6f3dd0abcb99d151e2f99"} Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.552566 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.559577 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sh6w6" podStartSLOduration=2.48951041 podStartE2EDuration="2.559560622s" podCreationTimestamp="2026-02-17 14:24:59 +0000 UTC" firstStartedPulling="2026-02-17 14:25:00.536523671 +0000 UTC m=+1181.116524323" lastFinishedPulling="2026-02-17 14:25:00.606573883 +0000 UTC m=+1181.186574535" observedRunningTime="2026-02-17 14:25:01.557747792 +0000 UTC m=+1182.137748464" watchObservedRunningTime="2026-02-17 14:25:01.559560622 +0000 UTC m=+1182.139561274" Feb 17 14:25:01 crc kubenswrapper[4762]: I0217 14:25:01.589958 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kmqrr" podStartSLOduration=6.320618782 podStartE2EDuration="22.589937111s" podCreationTimestamp="2026-02-17 14:24:39 +0000 UTC" firstStartedPulling="2026-02-17 14:24:39.773754948 +0000 UTC m=+1160.353755600" lastFinishedPulling="2026-02-17 14:24:56.043073277 +0000 UTC m=+1176.623073929" observedRunningTime="2026-02-17 14:25:01.58294587 +0000 UTC m=+1182.162946522" watchObservedRunningTime="2026-02-17 14:25:01.589937111 +0000 UTC m=+1182.169937783" Feb 17 14:25:02 crc kubenswrapper[4762]: I0217 14:25:02.089582 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c06222-8721-4d45-aeb5-ea93bab1ea85" path="/var/lib/kubelet/pods/18c06222-8721-4d45-aeb5-ea93bab1ea85/volumes" Feb 17 14:25:04 crc kubenswrapper[4762]: I0217 14:25:04.472435 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:25:05 crc kubenswrapper[4762]: I0217 14:25:04.521996 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:25:05 crc kubenswrapper[4762]: I0217 14:25:05.778263 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:25:05 crc kubenswrapper[4762]: I0217 14:25:05.778597 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:25:09 crc kubenswrapper[4762]: I0217 14:25:09.474916 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kmqrr" Feb 17 14:25:09 crc kubenswrapper[4762]: I0217 
14:25:09.682757 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:25:09 crc kubenswrapper[4762]: I0217 14:25:09.685546 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:25:09 crc kubenswrapper[4762]: I0217 14:25:09.721530 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:25:10 crc kubenswrapper[4762]: I0217 14:25:10.154988 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" Feb 17 14:25:10 crc kubenswrapper[4762]: I0217 14:25:10.201289 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-sh6w6" Feb 17 14:25:11 crc kubenswrapper[4762]: I0217 14:25:11.996044 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh"] Feb 17 14:25:11 crc kubenswrapper[4762]: E0217 14:25:11.996824 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c06222-8721-4d45-aeb5-ea93bab1ea85" containerName="registry-server" Feb 17 14:25:11 crc kubenswrapper[4762]: I0217 14:25:11.996840 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c06222-8721-4d45-aeb5-ea93bab1ea85" containerName="registry-server" Feb 17 14:25:11 crc kubenswrapper[4762]: I0217 14:25:11.997074 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c06222-8721-4d45-aeb5-ea93bab1ea85" containerName="registry-server" Feb 17 14:25:11 crc kubenswrapper[4762]: I0217 14:25:11.998720 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.022625 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh"] Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.027196 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gsr5n" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.070374 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.070495 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.070529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9t5q\" (UniqueName: \"kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.171630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.171799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.172170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9t5q\" (UniqueName: \"kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.175336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.175377 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.211009 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9t5q\" (UniqueName: \"kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.387168 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:12 crc kubenswrapper[4762]: I0217 14:25:12.903767 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh"] Feb 17 14:25:13 crc kubenswrapper[4762]: I0217 14:25:13.202772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" event={"ID":"0f03ab51-9f15-43df-b897-d62a6e067994","Type":"ContainerStarted","Data":"e8430fce0e4d94a7be3541a4dc64b7f9ea8455d50c2d556278be7ec771d21f5d"} Feb 17 14:25:14 crc kubenswrapper[4762]: I0217 14:25:14.217492 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f03ab51-9f15-43df-b897-d62a6e067994" containerID="733f964a2d4a3133c4b78f40766401d70ac2e89e1999d4280a1a18207ae7836a" exitCode=0 Feb 17 14:25:14 crc kubenswrapper[4762]: I0217 14:25:14.217799 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" event={"ID":"0f03ab51-9f15-43df-b897-d62a6e067994","Type":"ContainerDied","Data":"733f964a2d4a3133c4b78f40766401d70ac2e89e1999d4280a1a18207ae7836a"} Feb 17 14:25:16 crc kubenswrapper[4762]: I0217 14:25:16.465936 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f03ab51-9f15-43df-b897-d62a6e067994" containerID="bf5bf6c34a9079af07aa4663409bdea72e59b303eeb0b8d0b2ab12fbe698dfa6" exitCode=0 Feb 17 14:25:16 crc kubenswrapper[4762]: I0217 14:25:16.466021 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" event={"ID":"0f03ab51-9f15-43df-b897-d62a6e067994","Type":"ContainerDied","Data":"bf5bf6c34a9079af07aa4663409bdea72e59b303eeb0b8d0b2ab12fbe698dfa6"} Feb 17 14:25:17 crc kubenswrapper[4762]: I0217 14:25:17.476019 4762 generic.go:334] "Generic (PLEG): container finished" podID="0f03ab51-9f15-43df-b897-d62a6e067994" containerID="893e7c662ec4abd3ecbc7ea6ce0e72a8c61f7c8e47c5429cd5f1f476b087dfe8" exitCode=0 Feb 17 14:25:17 crc kubenswrapper[4762]: I0217 14:25:17.476057 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" event={"ID":"0f03ab51-9f15-43df-b897-d62a6e067994","Type":"ContainerDied","Data":"893e7c662ec4abd3ecbc7ea6ce0e72a8c61f7c8e47c5429cd5f1f476b087dfe8"} Feb 17 14:25:18 crc kubenswrapper[4762]: I0217 14:25:18.890181 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.066783 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle\") pod \"0f03ab51-9f15-43df-b897-d62a6e067994\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.066988 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util\") pod \"0f03ab51-9f15-43df-b897-d62a6e067994\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.067046 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9t5q\" (UniqueName: \"kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q\") pod \"0f03ab51-9f15-43df-b897-d62a6e067994\" (UID: \"0f03ab51-9f15-43df-b897-d62a6e067994\") " Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.067874 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle" (OuterVolumeSpecName: "bundle") pod "0f03ab51-9f15-43df-b897-d62a6e067994" (UID: "0f03ab51-9f15-43df-b897-d62a6e067994"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.080967 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util" (OuterVolumeSpecName: "util") pod "0f03ab51-9f15-43df-b897-d62a6e067994" (UID: "0f03ab51-9f15-43df-b897-d62a6e067994"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.088992 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q" (OuterVolumeSpecName: "kube-api-access-z9t5q") pod "0f03ab51-9f15-43df-b897-d62a6e067994" (UID: "0f03ab51-9f15-43df-b897-d62a6e067994"). InnerVolumeSpecName "kube-api-access-z9t5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.169386 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.169446 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9t5q\" (UniqueName: \"kubernetes.io/projected/0f03ab51-9f15-43df-b897-d62a6e067994-kube-api-access-z9t5q\") on node \"crc\" DevicePath \"\"" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.169458 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f03ab51-9f15-43df-b897-d62a6e067994-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.490772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" event={"ID":"0f03ab51-9f15-43df-b897-d62a6e067994","Type":"ContainerDied","Data":"e8430fce0e4d94a7be3541a4dc64b7f9ea8455d50c2d556278be7ec771d21f5d"} Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.490832 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8430fce0e4d94a7be3541a4dc64b7f9ea8455d50c2d556278be7ec771d21f5d" Feb 17 14:25:19 crc kubenswrapper[4762]: I0217 14:25:19.490842 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.092522 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c"] Feb 17 14:25:27 crc kubenswrapper[4762]: E0217 14:25:27.093487 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="extract" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.093503 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="extract" Feb 17 14:25:27 crc kubenswrapper[4762]: E0217 14:25:27.093521 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="util" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.093539 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="util" Feb 17 14:25:27 crc kubenswrapper[4762]: E0217 14:25:27.093579 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="pull" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.093606 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="pull" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.093825 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f03ab51-9f15-43df-b897-d62a6e067994" containerName="extract" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.094500 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.096480 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfrsm\" (UniqueName: \"kubernetes.io/projected/517df0cc-d4c5-41f7-aa3d-53b2830f427c-kube-api-access-jfrsm\") pod \"openstack-operator-controller-init-7464dc569f-ggt7c\" (UID: \"517df0cc-d4c5-41f7-aa3d-53b2830f427c\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.097871 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-xt2xf" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.357019 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfrsm\" (UniqueName: \"kubernetes.io/projected/517df0cc-d4c5-41f7-aa3d-53b2830f427c-kube-api-access-jfrsm\") pod \"openstack-operator-controller-init-7464dc569f-ggt7c\" (UID: \"517df0cc-d4c5-41f7-aa3d-53b2830f427c\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.388514 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfrsm\" (UniqueName: \"kubernetes.io/projected/517df0cc-d4c5-41f7-aa3d-53b2830f427c-kube-api-access-jfrsm\") pod \"openstack-operator-controller-init-7464dc569f-ggt7c\" (UID: \"517df0cc-d4c5-41f7-aa3d-53b2830f427c\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.422838 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.432615 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c"] Feb 17 14:25:27 crc kubenswrapper[4762]: I0217 14:25:27.965806 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c"] Feb 17 14:25:28 crc kubenswrapper[4762]: I0217 14:25:28.923893 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" event={"ID":"517df0cc-d4c5-41f7-aa3d-53b2830f427c","Type":"ContainerStarted","Data":"cce201ff361b54799c9030b3b7a814f756519aeb99215f1a5a4df7de8346c2ec"} Feb 17 14:25:36 crc kubenswrapper[4762]: I0217 14:25:36.205496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" event={"ID":"517df0cc-d4c5-41f7-aa3d-53b2830f427c","Type":"ContainerStarted","Data":"e5963821fce857265baf9bdbf62645ccacc9f8710e7c02491024ace26ce50ff3"} Feb 17 14:25:36 crc kubenswrapper[4762]: I0217 14:25:36.206907 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:25:36 crc kubenswrapper[4762]: I0217 14:25:36.236133 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" podStartSLOduration=1.7565061960000001 podStartE2EDuration="9.236113177s" podCreationTimestamp="2026-02-17 14:25:27 +0000 UTC" firstStartedPulling="2026-02-17 14:25:27.978643366 +0000 UTC m=+1208.558644018" lastFinishedPulling="2026-02-17 14:25:35.458250347 +0000 UTC m=+1216.038250999" observedRunningTime="2026-02-17 14:25:36.230076172 +0000 UTC m=+1216.810076844" watchObservedRunningTime="2026-02-17 14:25:36.236113177 +0000 UTC m=+1216.816113829" Feb 17 14:25:47 crc kubenswrapper[4762]: I0217 14:25:47.427292 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-ggt7c" Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.921343 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h"] Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.923074 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.925288 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vx4pd" Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.932296 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n"] Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.933549 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.942364 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h"] Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.949494 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rdjkr" Feb 17 14:26:07 crc kubenswrapper[4762]: I0217 14:26:07.984712 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.005551 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.007183 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.010619 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gqljt" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.034704 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.053962 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-spgjw"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.055671 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.065088 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-spgjw"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.082624 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgd6k\" (UniqueName: \"kubernetes.io/projected/6b0c5012-70b1-42f3-9bf1-734acf6a8f2f-kube-api-access-jgd6k\") pod \"barbican-operator-controller-manager-868647ff47-4bg4h\" (UID: \"6b0c5012-70b1-42f3-9bf1-734acf6a8f2f\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.082701 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbt5\" (UniqueName: \"kubernetes.io/projected/004074b2-55cb-4596-84e6-b715ec66bd2c-kube-api-access-pfbt5\") pod \"cinder-operator-controller-manager-5d946d989d-rnh4n\" (UID: \"004074b2-55cb-4596-84e6-b715ec66bd2c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.082734 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshd6\" (UniqueName: \"kubernetes.io/projected/bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1-kube-api-access-vshd6\") pod \"designate-operator-controller-manager-6d8bf5c495-ftcx6\" (UID: \"bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.090333 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qjjwl" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.110992 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.123575 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.123622 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.124269 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.124982 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2k62f"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.126274 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.126788 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.138970 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nppkg" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.139182 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.139189 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bkf9s" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.139315 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-s6bw6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.185132 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.186471 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188089 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtwr\" (UniqueName: \"kubernetes.io/projected/f2be497a-b70f-49ca-880e-9675bfd83a93-kube-api-access-fmtwr\") pod \"heat-operator-controller-manager-69f49c598c-ww45l\" (UID: \"f2be497a-b70f-49ca-880e-9675bfd83a93\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgd6k\" (UniqueName: \"kubernetes.io/projected/6b0c5012-70b1-42f3-9bf1-734acf6a8f2f-kube-api-access-jgd6k\") pod \"barbican-operator-controller-manager-868647ff47-4bg4h\" (UID: \"6b0c5012-70b1-42f3-9bf1-734acf6a8f2f\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbt5\" (UniqueName: \"kubernetes.io/projected/004074b2-55cb-4596-84e6-b715ec66bd2c-kube-api-access-pfbt5\") pod \"cinder-operator-controller-manager-5d946d989d-rnh4n\" (UID: \"004074b2-55cb-4596-84e6-b715ec66bd2c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188250 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshd6\" (UniqueName: \"kubernetes.io/projected/bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1-kube-api-access-vshd6\") pod \"designate-operator-controller-manager-6d8bf5c495-ftcx6\" (UID: \"bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188328 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvckq\" (UniqueName: \"kubernetes.io/projected/6b5af5f5-ea83-427b-b987-f6215d329670-kube-api-access-tvckq\") pod \"glance-operator-controller-manager-77987464f4-spgjw\" (UID: 
\"6b5af5f5-ea83-427b-b987-f6215d329670\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.188390 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fplf\" (UniqueName: \"kubernetes.io/projected/09b86f06-6cae-45aa-8e1e-8de6408dae32-kube-api-access-5fplf\") pod \"horizon-operator-controller-manager-5b9b8895d5-6mbwp\" (UID: \"09b86f06-6cae-45aa-8e1e-8de6408dae32\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.189048 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kmtrj" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.220726 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.233613 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.253294 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.257361 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshd6\" (UniqueName: \"kubernetes.io/projected/bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1-kube-api-access-vshd6\") pod \"designate-operator-controller-manager-6d8bf5c495-ftcx6\" (UID: \"bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.257717 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgd6k\" (UniqueName: \"kubernetes.io/projected/6b0c5012-70b1-42f3-9bf1-734acf6a8f2f-kube-api-access-jgd6k\") pod \"barbican-operator-controller-manager-868647ff47-4bg4h\" (UID: \"6b0c5012-70b1-42f3-9bf1-734acf6a8f2f\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.266484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbt5\" (UniqueName: \"kubernetes.io/projected/004074b2-55cb-4596-84e6-b715ec66bd2c-kube-api-access-pfbt5\") pod \"cinder-operator-controller-manager-5d946d989d-rnh4n\" (UID: \"004074b2-55cb-4596-84e6-b715ec66bd2c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.266957 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.267956 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jkm4r" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294380 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvckq\" (UniqueName: \"kubernetes.io/projected/6b5af5f5-ea83-427b-b987-f6215d329670-kube-api-access-tvckq\") pod \"glance-operator-controller-manager-77987464f4-spgjw\" (UID: \"6b5af5f5-ea83-427b-b987-f6215d329670\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294456 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7tvz\" (UniqueName: \"kubernetes.io/projected/2ebeafd3-8c4c-4473-b382-7f190a92096a-kube-api-access-l7tvz\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294507 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fplf\" (UniqueName: \"kubernetes.io/projected/09b86f06-6cae-45aa-8e1e-8de6408dae32-kube-api-access-5fplf\") pod \"horizon-operator-controller-manager-5b9b8895d5-6mbwp\" (UID: \"09b86f06-6cae-45aa-8e1e-8de6408dae32\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294592 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdp24\" (UniqueName: \"kubernetes.io/projected/6a22270e-2c9e-48d2-8554-8885a67fa92d-kube-api-access-xdp24\") pod \"ironic-operator-controller-manager-554564d7fc-x847n\" (UID: \"6a22270e-2c9e-48d2-8554-8885a67fa92d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.294690 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtwr\" (UniqueName: \"kubernetes.io/projected/f2be497a-b70f-49ca-880e-9675bfd83a93-kube-api-access-fmtwr\") pod \"heat-operator-controller-manager-69f49c598c-ww45l\" (UID: \"f2be497a-b70f-49ca-880e-9675bfd83a93\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.295357 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.336728 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.345026 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtwr\" (UniqueName: \"kubernetes.io/projected/f2be497a-b70f-49ca-880e-9675bfd83a93-kube-api-access-fmtwr\") pod \"heat-operator-controller-manager-69f49c598c-ww45l\" (UID: \"f2be497a-b70f-49ca-880e-9675bfd83a93\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.368827 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2k62f"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.398870 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7tvz\" (UniqueName: \"kubernetes.io/projected/2ebeafd3-8c4c-4473-b382-7f190a92096a-kube-api-access-l7tvz\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.399502 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvj9x\" (UniqueName: \"kubernetes.io/projected/0178fd98-dd5b-43f5-b2cd-d118b3803888-kube-api-access-lvj9x\") pod \"keystone-operator-controller-manager-b4d948c87-kt8qn\" (UID: \"0178fd98-dd5b-43f5-b2cd-d118b3803888\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.399591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdp24\" (UniqueName: \"kubernetes.io/projected/6a22270e-2c9e-48d2-8554-8885a67fa92d-kube-api-access-xdp24\") pod \"ironic-operator-controller-manager-554564d7fc-x847n\" (UID: \"6a22270e-2c9e-48d2-8554-8885a67fa92d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.399636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: E0217 14:26:08.399821 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:08 crc kubenswrapper[4762]: E0217 14:26:08.399880 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:08.899860534 +0000 UTC m=+1249.479861186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.403477 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fplf\" (UniqueName: \"kubernetes.io/projected/09b86f06-6cae-45aa-8e1e-8de6408dae32-kube-api-access-5fplf\") pod \"horizon-operator-controller-manager-5b9b8895d5-6mbwp\" (UID: \"09b86f06-6cae-45aa-8e1e-8de6408dae32\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.409406 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.414237 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvckq\" (UniqueName: \"kubernetes.io/projected/6b5af5f5-ea83-427b-b987-f6215d329670-kube-api-access-tvckq\") pod \"glance-operator-controller-manager-77987464f4-spgjw\" (UID: \"6b5af5f5-ea83-427b-b987-f6215d329670\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.444170 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.468604 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdp24\" (UniqueName: \"kubernetes.io/projected/6a22270e-2c9e-48d2-8554-8885a67fa92d-kube-api-access-xdp24\") pod \"ironic-operator-controller-manager-554564d7fc-x847n\" (UID: \"6a22270e-2c9e-48d2-8554-8885a67fa92d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.469317 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.490532 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.525361 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvj9x\" (UniqueName: \"kubernetes.io/projected/0178fd98-dd5b-43f5-b2cd-d118b3803888-kube-api-access-lvj9x\") pod \"keystone-operator-controller-manager-b4d948c87-kt8qn\" (UID: \"0178fd98-dd5b-43f5-b2cd-d118b3803888\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.556390 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.557658 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7tvz\" (UniqueName: \"kubernetes.io/projected/2ebeafd3-8c4c-4473-b382-7f190a92096a-kube-api-access-l7tvz\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.574164 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.577596 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cpk52" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.615918 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.616706 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.628483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbj7v\" (UniqueName: \"kubernetes.io/projected/9c5eb531-17f0-4eae-a0a6-f44f2ca0da97-kube-api-access-rbj7v\") pod \"manila-operator-controller-manager-54f6768c69-gtjx5\" (UID: \"9c5eb531-17f0-4eae-a0a6-f44f2ca0da97\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.703063 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.705706 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.715242 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.839001 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lt4kd" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.872753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvj9x\" (UniqueName: \"kubernetes.io/projected/0178fd98-dd5b-43f5-b2cd-d118b3803888-kube-api-access-lvj9x\") pod \"keystone-operator-controller-manager-b4d948c87-kt8qn\" (UID: \"0178fd98-dd5b-43f5-b2cd-d118b3803888\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.877462 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.878855 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcgn9\" (UniqueName: \"kubernetes.io/projected/0cf7a5f5-8168-4054-8aba-55315da55d18-kube-api-access-kcgn9\") pod \"mariadb-operator-controller-manager-6994f66f48-wwhs6\" (UID: \"0cf7a5f5-8168-4054-8aba-55315da55d18\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.879126 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbj7v\" (UniqueName: \"kubernetes.io/projected/9c5eb531-17f0-4eae-a0a6-f44f2ca0da97-kube-api-access-rbj7v\") pod \"manila-operator-controller-manager-54f6768c69-gtjx5\" (UID: \"9c5eb531-17f0-4eae-a0a6-f44f2ca0da97\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.983133 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6"] Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.986787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcgn9\" (UniqueName: \"kubernetes.io/projected/0cf7a5f5-8168-4054-8aba-55315da55d18-kube-api-access-kcgn9\") pod \"mariadb-operator-controller-manager-6994f66f48-wwhs6\" (UID: \"0cf7a5f5-8168-4054-8aba-55315da55d18\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:08 crc kubenswrapper[4762]: I0217 14:26:08.988515 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:08 crc kubenswrapper[4762]: E0217 14:26:08.988972 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:08 crc kubenswrapper[4762]: E0217 14:26:08.989027 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:09.989010983 +0000 UTC m=+1250.569011635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.046300 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcgn9\" (UniqueName: \"kubernetes.io/projected/0cf7a5f5-8168-4054-8aba-55315da55d18-kube-api-access-kcgn9\") pod \"mariadb-operator-controller-manager-6994f66f48-wwhs6\" (UID: \"0cf7a5f5-8168-4054-8aba-55315da55d18\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.075753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbj7v\" (UniqueName: \"kubernetes.io/projected/9c5eb531-17f0-4eae-a0a6-f44f2ca0da97-kube-api-access-rbj7v\") pod \"manila-operator-controller-manager-54f6768c69-gtjx5\" (UID: \"9c5eb531-17f0-4eae-a0a6-f44f2ca0da97\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.133866 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.135227 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.142483 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-699gw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.179491 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.192989 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.200841 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.207937 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-w2gwk" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.209075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xwz\" (UniqueName: \"kubernetes.io/projected/0c922b97-d376-45cc-986d-c13735e6c43e-kube-api-access-c8xwz\") pod \"neutron-operator-controller-manager-64ddbf8bb-74hcc\" (UID: \"0c922b97-d376-45cc-986d-c13735e6c43e\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.237081 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.238703 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.245428 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-n27t7" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.248996 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.269915 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.271263 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.274826 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.275328 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g642q" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.279893 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.287625 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.309190 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.310757 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.310997 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.311074 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46tn\" (UniqueName: \"kubernetes.io/projected/b570b810-b8a4-4ca0-89d5-3992368a4867-kube-api-access-g46tn\") pod \"nova-operator-controller-manager-567668f5cf-jh42l\" (UID: \"b570b810-b8a4-4ca0-89d5-3992368a4867\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.311111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvvv\" (UniqueName: \"kubernetes.io/projected/6abe751d-7643-4aa7-a843-bbde4ed4a457-kube-api-access-lbvvv\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.311141 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxgg\" (UniqueName: \"kubernetes.io/projected/149d4551-5870-46cb-871b-8a0e5dd25508-kube-api-access-gwxgg\") pod \"octavia-operator-controller-manager-69f8888797-xg6kw\" (UID: \"149d4551-5870-46cb-871b-8a0e5dd25508\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.311297 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xwz\" (UniqueName: \"kubernetes.io/projected/0c922b97-d376-45cc-986d-c13735e6c43e-kube-api-access-c8xwz\") pod \"neutron-operator-controller-manager-64ddbf8bb-74hcc\" (UID: \"0c922b97-d376-45cc-986d-c13735e6c43e\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.314950 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hp8lh" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.329041 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.330523 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.334032 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-f2ddp" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.342580 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.345077 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.356344 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lr5qw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.366720 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.366902 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.379583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xwz\" (UniqueName: \"kubernetes.io/projected/0c922b97-d376-45cc-986d-c13735e6c43e-kube-api-access-c8xwz\") pod \"neutron-operator-controller-manager-64ddbf8bb-74hcc\" (UID: \"0c922b97-d376-45cc-986d-c13735e6c43e\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413322 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkr78\" (UniqueName: \"kubernetes.io/projected/2d3c8e1f-e388-467a-a744-5c332868bde3-kube-api-access-xkr78\") pod \"ovn-operator-controller-manager-d44cf6b75-qbgn5\" (UID: \"2d3c8e1f-e388-467a-a744-5c332868bde3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413402 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfx7\" (UniqueName: \"kubernetes.io/projected/afb78ebd-d200-4441-a12f-e1e63dfb71d9-kube-api-access-8dfx7\") pod \"swift-operator-controller-manager-68f46476f-jkgwj\" (UID: \"afb78ebd-d200-4441-a12f-e1e63dfb71d9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413475 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46tn\" (UniqueName: \"kubernetes.io/projected/b570b810-b8a4-4ca0-89d5-3992368a4867-kube-api-access-g46tn\") pod \"nova-operator-controller-manager-567668f5cf-jh42l\" (UID: \"b570b810-b8a4-4ca0-89d5-3992368a4867\") " 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413504 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvvv\" (UniqueName: \"kubernetes.io/projected/6abe751d-7643-4aa7-a843-bbde4ed4a457-kube-api-access-lbvvv\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwxgg\" (UniqueName: \"kubernetes.io/projected/149d4551-5870-46cb-871b-8a0e5dd25508-kube-api-access-gwxgg\") pod \"octavia-operator-controller-manager-69f8888797-xg6kw\" (UID: \"149d4551-5870-46cb-871b-8a0e5dd25508\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.413581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c289\" (UniqueName: \"kubernetes.io/projected/4414da08-4cca-4b53-b590-3511e77060e0-kube-api-access-5c289\") pod \"placement-operator-controller-manager-8497b45c89-jtvhg\" (UID: \"4414da08-4cca-4b53-b590-3511e77060e0\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.414389 4762 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.414447 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert podName:6abe751d-7643-4aa7-a843-bbde4ed4a457 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:09.914427783 +0000 UTC m=+1250.494428435 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" (UID: "6abe751d-7643-4aa7-a843-bbde4ed4a457") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.454526 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46tn\" (UniqueName: \"kubernetes.io/projected/b570b810-b8a4-4ca0-89d5-3992368a4867-kube-api-access-g46tn\") pod \"nova-operator-controller-manager-567668f5cf-jh42l\" (UID: \"b570b810-b8a4-4ca0-89d5-3992368a4867\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.459679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxgg\" (UniqueName: \"kubernetes.io/projected/149d4551-5870-46cb-871b-8a0e5dd25508-kube-api-access-gwxgg\") pod \"octavia-operator-controller-manager-69f8888797-xg6kw\" (UID: \"149d4551-5870-46cb-871b-8a0e5dd25508\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.463493 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvvv\" (UniqueName: \"kubernetes.io/projected/6abe751d-7643-4aa7-a843-bbde4ed4a457-kube-api-access-lbvvv\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.514386 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.514680 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkr78\" (UniqueName: \"kubernetes.io/projected/2d3c8e1f-e388-467a-a744-5c332868bde3-kube-api-access-xkr78\") pod \"ovn-operator-controller-manager-d44cf6b75-qbgn5\" (UID: \"2d3c8e1f-e388-467a-a744-5c332868bde3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.514756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfx7\" (UniqueName: \"kubernetes.io/projected/afb78ebd-d200-4441-a12f-e1e63dfb71d9-kube-api-access-8dfx7\") pod \"swift-operator-controller-manager-68f46476f-jkgwj\" (UID: \"afb78ebd-d200-4441-a12f-e1e63dfb71d9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.514805 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c289\" (UniqueName: \"kubernetes.io/projected/4414da08-4cca-4b53-b590-3511e77060e0-kube-api-access-5c289\") pod \"placement-operator-controller-manager-8497b45c89-jtvhg\" (UID: \"4414da08-4cca-4b53-b590-3511e77060e0\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.524095 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.557250 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c289\" (UniqueName: \"kubernetes.io/projected/4414da08-4cca-4b53-b590-3511e77060e0-kube-api-access-5c289\") pod \"placement-operator-controller-manager-8497b45c89-jtvhg\" (UID: \"4414da08-4cca-4b53-b590-3511e77060e0\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.561418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkr78\" (UniqueName: \"kubernetes.io/projected/2d3c8e1f-e388-467a-a744-5c332868bde3-kube-api-access-xkr78\") pod \"ovn-operator-controller-manager-d44cf6b75-qbgn5\" (UID: \"2d3c8e1f-e388-467a-a744-5c332868bde3\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.565961 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfx7\" (UniqueName: \"kubernetes.io/projected/afb78ebd-d200-4441-a12f-e1e63dfb71d9-kube-api-access-8dfx7\") pod \"swift-operator-controller-manager-68f46476f-jkgwj\" (UID: \"afb78ebd-d200-4441-a12f-e1e63dfb71d9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.599064 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2hv4z"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.600300 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.600406 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.616056 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5ks\" (UniqueName: \"kubernetes.io/projected/f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb-kube-api-access-mb5ks\") pod \"test-operator-controller-manager-7866795846-2hv4z\" (UID: \"f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.620715 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.629274 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qzpx6" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.654077 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.654763 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2hv4z"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.677835 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.679992 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.686223 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-v74p8" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.699004 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.717444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5ks\" (UniqueName: \"kubernetes.io/projected/f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb-kube-api-access-mb5ks\") pod \"test-operator-controller-manager-7866795846-2hv4z\" (UID: \"f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.717497 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdk8k\" (UniqueName: \"kubernetes.io/projected/ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e-kube-api-access-wdk8k\") pod \"telemetry-operator-controller-manager-6d6964fcdb-5jb4z\" (UID: \"ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.736146 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.738051 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.746614 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-s7hvb" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.754510 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5ks\" (UniqueName: \"kubernetes.io/projected/f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb-kube-api-access-mb5ks\") pod \"test-operator-controller-manager-7866795846-2hv4z\" (UID: \"f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.763520 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.772852 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.797595 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.801633 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.808718 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.809017 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bxmfb" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.814202 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.815005 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.818859 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.823831 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxgf\" (UniqueName: \"kubernetes.io/projected/2dd899d8-8882-45e1-952a-e4103384ac4c-kube-api-access-4xxgf\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.823895 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.823955 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.823997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdk8k\" (UniqueName: \"kubernetes.io/projected/ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e-kube-api-access-wdk8k\") pod \"telemetry-operator-controller-manager-6d6964fcdb-5jb4z\" (UID: \"ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.824023 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5sl5\" (UniqueName: \"kubernetes.io/projected/a7230b0a-9b7e-4430-843d-7754ba5dc370-kube-api-access-d5sl5\") pod \"watcher-operator-controller-manager-5db88f68c-bzgvz\" (UID: \"a7230b0a-9b7e-4430-843d-7754ba5dc370\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.889336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdk8k\" (UniqueName: \"kubernetes.io/projected/ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e-kube-api-access-wdk8k\") pod \"telemetry-operator-controller-manager-6d6964fcdb-5jb4z\" (UID: \"ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.903802 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x"] Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.906682 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.913377 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qct2l" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938576 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxgf\" (UniqueName: \"kubernetes.io/projected/2dd899d8-8882-45e1-952a-e4103384ac4c-kube-api-access-4xxgf\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5sl5\" (UniqueName: \"kubernetes.io/projected/a7230b0a-9b7e-4430-843d-7754ba5dc370-kube-api-access-d5sl5\") pod \"watcher-operator-controller-manager-5db88f68c-bzgvz\" (UID: \"a7230b0a-9b7e-4430-843d-7754ba5dc370\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.938873 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj6r\" (UniqueName: \"kubernetes.io/projected/4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d-kube-api-access-jgj6r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6pl9x\" (UID: \"4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.939500 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x"] Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.939585 4762 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.940170 4762 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:10.440151282 +0000 UTC m=+1251.020151934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "metrics-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.939629 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.940562 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:10.440551743 +0000 UTC m=+1251.020552395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.939849 4762 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: E0217 14:26:09.940765 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert podName:6abe751d-7643-4aa7-a843-bbde4ed4a457 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:10.940755338 +0000 UTC m=+1251.520756000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" (UID: "6abe751d-7643-4aa7-a843-bbde4ed4a457") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.961743 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxgf\" (UniqueName: \"kubernetes.io/projected/2dd899d8-8882-45e1-952a-e4103384ac4c-kube-api-access-4xxgf\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.966729 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5sl5\" (UniqueName: \"kubernetes.io/projected/a7230b0a-9b7e-4430-843d-7754ba5dc370-kube-api-access-d5sl5\") pod \"watcher-operator-controller-manager-5db88f68c-bzgvz\" (UID: \"a7230b0a-9b7e-4430-843d-7754ba5dc370\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:09 crc kubenswrapper[4762]: I0217 14:26:09.986239 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.016724 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.041327 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj6r\" (UniqueName: \"kubernetes.io/projected/4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d-kube-api-access-jgj6r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6pl9x\" (UID: \"4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.041448 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.041603 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.041682 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:12.041661092 +0000 UTC m=+1252.621661744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.044204 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.076152 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.095599 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.107293 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj6r\" (UniqueName: \"kubernetes.io/projected/4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d-kube-api-access-jgj6r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6pl9x\" (UID: \"4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.153860 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.243302 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n"] Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.269788 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6"] Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.454909 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.455284 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.455448 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.455582 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:11.455533007 +0000 UTC m=+1252.035533729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.456131 4762 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: E0217 14:26:10.456203 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:11.456177734 +0000 UTC m=+1252.036178446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "metrics-server-cert" not found Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.764534 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h"] Feb 17 14:26:10 crc kubenswrapper[4762]: I0217 14:26:10.814563 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l"] Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.012006 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.012401 4762 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.012478 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert podName:6abe751d-7643-4aa7-a843-bbde4ed4a457 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:13.012445336 +0000 UTC m=+1253.592445988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" (UID: "6abe751d-7643-4aa7-a843-bbde4ed4a457") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.111199 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-spgjw"] Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.246978 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" event={"ID":"f2be497a-b70f-49ca-880e-9675bfd83a93","Type":"ContainerStarted","Data":"239a291dc8278490c7a14bf0e064aefcabbc8ca7d4b2ba161a24de2e1d123fe4"} Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.249175 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" event={"ID":"6b5af5f5-ea83-427b-b987-f6215d329670","Type":"ContainerStarted","Data":"ce434c32559b63349403617d392cbb3f0c546274dcd17b2e26c1d81075e577f7"} Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.257013 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" event={"ID":"6b0c5012-70b1-42f3-9bf1-734acf6a8f2f","Type":"ContainerStarted","Data":"04f6f467bea54cd953a045aaaef0e69cca423cf08c30b322fe1d1e09b0c2e3ee"} Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.259152 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" 
event={"ID":"bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1","Type":"ContainerStarted","Data":"33cd8706bccc7457e2c1688fac034ea1ac46a757b5b3aed99c140a6191ff3b2d"} Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.260673 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" event={"ID":"004074b2-55cb-4596-84e6-b715ec66bd2c","Type":"ContainerStarted","Data":"ecf9bef9a6c329d6b59137e8766669fb15ddb47651b928ce2c335edc65957533"} Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.526943 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.527105 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.527391 4762 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.527541 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:13.527430841 +0000 UTC m=+1254.107431493 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "metrics-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.528264 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: E0217 14:26:11.528294 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:13.528286344 +0000 UTC m=+1254.108286996 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.621583 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn"] Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.659863 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n"] Feb 17 14:26:11 crc kubenswrapper[4762]: I0217 14:26:11.777351 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.082858 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:12 crc kubenswrapper[4762]: E0217 14:26:12.083112 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:12 crc kubenswrapper[4762]: E0217 14:26:12.083204 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:16.083183889 +0000 UTC m=+1256.663184541 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.330566 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" event={"ID":"09b86f06-6cae-45aa-8e1e-8de6408dae32","Type":"ContainerStarted","Data":"23645c880e8c1a8f0faac0bcf9364956599240de05c2bf22296fb2fc06743a8f"} Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.334444 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" event={"ID":"0178fd98-dd5b-43f5-b2cd-d118b3803888","Type":"ContainerStarted","Data":"8d98ce9d06a2f9808a247ef18aa93ecdd3e0f9fad1208af3f6a286e32de3c542"} Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.361098 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" event={"ID":"6a22270e-2c9e-48d2-8554-8885a67fa92d","Type":"ContainerStarted","Data":"b3f55d16b29ac953dd2e825eb9d633b265f0394d008236d0d67adf6ac0c1826e"} Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.818933 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.855918 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6"] Feb 17 14:26:12 crc kubenswrapper[4762]: W0217 14:26:12.861389 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4414da08_4cca_4b53_b590_3511e77060e0.slice/crio-2fa1f0b94718c65b3f8e2c38f7bc1554b85b0e310364d3c2d87f4f5c72f8a51e WatchSource:0}: Error finding container 2fa1f0b94718c65b3f8e2c38f7bc1554b85b0e310364d3c2d87f4f5c72f8a51e: Status 404 returned error can't find the container with id 2fa1f0b94718c65b3f8e2c38f7bc1554b85b0e310364d3c2d87f4f5c72f8a51e Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.870860 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.920256 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.942749 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.952814 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc"] Feb 17 14:26:12 crc kubenswrapper[4762]: I0217 14:26:12.960874 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5"] Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.085582 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x"] Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.097419 4762 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2hv4z"] Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.104458 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.104696 4762 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.104766 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert podName:6abe751d-7643-4aa7-a843-bbde4ed4a457 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:17.104748329 +0000 UTC m=+1257.684748981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" (UID: "6abe751d-7643-4aa7-a843-bbde4ed4a457") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.110399 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz"] Feb 17 14:26:13 crc kubenswrapper[4762]: W0217 14:26:13.111843 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c922b97_d376_45cc_986d_c13735e6c43e.slice/crio-d259b78b110d15df22635fc41f9a311efef082e284fc2f83b62e8a7a5b02070f WatchSource:0}: Error finding container d259b78b110d15df22635fc41f9a311efef082e284fc2f83b62e8a7a5b02070f: Status 404 returned error can't find the container with id d259b78b110d15df22635fc41f9a311efef082e284fc2f83b62e8a7a5b02070f Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.235924 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw"] Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.253614 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z"] Feb 17 14:26:13 crc kubenswrapper[4762]: W0217 14:26:13.277508 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149d4551_5870_46cb_871b_8a0e5dd25508.slice/crio-0e2ff9b9e3289f6973bc8ea59ca662d77d6c5f29c2a2104ad1537c9322de1419 WatchSource:0}: Error finding container 0e2ff9b9e3289f6973bc8ea59ca662d77d6c5f29c2a2104ad1537c9322de1419: Status 404 returned error can't find the container with id 0e2ff9b9e3289f6973bc8ea59ca662d77d6c5f29c2a2104ad1537c9322de1419 Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.281720 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwxgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-xg6kw_openstack-operators(149d4551-5870-46cb-871b-8a0e5dd25508): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.283452 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" podUID="149d4551-5870-46cb-871b-8a0e5dd25508" Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.371772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" event={"ID":"149d4551-5870-46cb-871b-8a0e5dd25508","Type":"ContainerStarted","Data":"0e2ff9b9e3289f6973bc8ea59ca662d77d6c5f29c2a2104ad1537c9322de1419"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.373838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" event={"ID":"0cf7a5f5-8168-4054-8aba-55315da55d18","Type":"ContainerStarted","Data":"21c77dd9195ea6b28f2fbc1d7a252f94f7cb0dbea9133a52fe97c86cf2a17b7f"} Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.373998 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" podUID="149d4551-5870-46cb-871b-8a0e5dd25508" Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.375999 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" event={"ID":"b570b810-b8a4-4ca0-89d5-3992368a4867","Type":"ContainerStarted","Data":"22e3cf9299f1aac2a42a90711ed26503c2b40678576eda0d9c9a7f271ba57c2f"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.377294 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" event={"ID":"afb78ebd-d200-4441-a12f-e1e63dfb71d9","Type":"ContainerStarted","Data":"a151b8ef33737dced96bbad74939cdb1a1e15fa8c0efd9759f49f7d1431b1f61"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.378879 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" event={"ID":"4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d","Type":"ContainerStarted","Data":"3ef93bb4b9332849df9b5ac2cafb50326faa8fd992371515021c740a3d58568d"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.380469 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" event={"ID":"f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb","Type":"ContainerStarted","Data":"806d0e76584a63114bc07545307c3ffe302f41b88f70e00a1ca366a13cdcd8c9"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.381746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" event={"ID":"9c5eb531-17f0-4eae-a0a6-f44f2ca0da97","Type":"ContainerStarted","Data":"cab507a7cfb75a186ca9edb31c2e8953e67e92fc663130da7ef9bfe1c04d5cfc"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.383049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" event={"ID":"a7230b0a-9b7e-4430-843d-7754ba5dc370","Type":"ContainerStarted","Data":"bf3056693120b1648885c17923c6acdb3b7d86842210137fa3ccd08d3ed7d249"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.385151 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" event={"ID":"ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e","Type":"ContainerStarted","Data":"869abe1b9d6014cfb8626e3b1a1d28648ea01d8125716c037f1f4b43105b835f"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.394216 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" event={"ID":"2d3c8e1f-e388-467a-a744-5c332868bde3","Type":"ContainerStarted","Data":"6a0070adc7ce3f8e811ac37e28e9c32d99ef5719d722f0079a6d4dc043440a36"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.401240 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" event={"ID":"0c922b97-d376-45cc-986d-c13735e6c43e","Type":"ContainerStarted","Data":"d259b78b110d15df22635fc41f9a311efef082e284fc2f83b62e8a7a5b02070f"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.404936 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" event={"ID":"4414da08-4cca-4b53-b590-3511e77060e0","Type":"ContainerStarted","Data":"2fa1f0b94718c65b3f8e2c38f7bc1554b85b0e310364d3c2d87f4f5c72f8a51e"} Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.631637 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:13 crc kubenswrapper[4762]: I0217 14:26:13.631767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.632023 4762 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.632142 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:17.632109282 +0000 UTC m=+1258.212110014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "metrics-server-cert" not found Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.632550 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:13 crc kubenswrapper[4762]: E0217 14:26:13.632683 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:17.632627426 +0000 UTC m=+1258.212628078 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:14 crc kubenswrapper[4762]: E0217 14:26:14.415948 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" podUID="149d4551-5870-46cb-871b-8a0e5dd25508" Feb 17 14:26:16 crc kubenswrapper[4762]: I0217 14:26:16.119698 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:16 crc kubenswrapper[4762]: E0217 14:26:16.120079 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:16 crc kubenswrapper[4762]: E0217 14:26:16.120142 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:24.120123944 +0000 UTC m=+1264.700124596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: I0217 14:26:17.181850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.182060 4762 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.182438 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert podName:6abe751d-7643-4aa7-a843-bbde4ed4a457 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:25.182418956 +0000 UTC m=+1265.762419608 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" (UID: "6abe751d-7643-4aa7-a843-bbde4ed4a457") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: I0217 14:26:17.703952 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:17 crc kubenswrapper[4762]: I0217 14:26:17.704036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.704212 4762 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.704280 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:25.704261157 +0000 UTC m=+1266.284261809 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "metrics-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.704853 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:17 crc kubenswrapper[4762]: E0217 14:26:17.704944 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:25.704909565 +0000 UTC m=+1266.284910297 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:24 crc kubenswrapper[4762]: I0217 14:26:24.192505 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:24 crc kubenswrapper[4762]: E0217 14:26:24.192785 4762 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:24 crc kubenswrapper[4762]: E0217 14:26:24.193135 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert podName:2ebeafd3-8c4c-4473-b382-7f190a92096a nodeName:}" failed. No retries permitted until 2026-02-17 14:26:40.193108331 +0000 UTC m=+1280.773108983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert") pod "infra-operator-controller-manager-79d975b745-2k62f" (UID: "2ebeafd3-8c4c-4473-b382-7f190a92096a") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.192029 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.200431 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6abe751d-7643-4aa7-a843-bbde4ed4a457-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr\" (UID: \"6abe751d-7643-4aa7-a843-bbde4ed4a457\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.417151 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g642q" Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.422419 4762 util.go:30] "No sandbox for pod can be found. 
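Note the durationBeforeRetry progression across these retries for the same volumes: 4s at 14:26:13, 8s at 14:26:17, then 16s at 14:26:24. The kubelet's pending-operations machinery doubles the delay after each failed attempt, up to a cap. A minimal sketch of that doubling-with-cap policy (the constants here are illustrative, not kubelet's exact values):

    package main

    import (
        "fmt"
        "time"
    )

    const (
        initialDelay = 2 * time.Second // illustrative base delay
        maxDelay     = 2 * time.Minute // illustrative cap
    )

    // nextDelay doubles the previous retry delay and caps it at maxDelay.
    func nextDelay(prev time.Duration) time.Duration {
        if prev == 0 {
            return initialDelay
        }
        if next := 2 * prev; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 8; i++ {
            d = nextDelay(d)
            fmt.Println(d) // 2s 4s 8s 16s 32s 1m4s 2m0s 2m0s
        }
    }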
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.803067 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.803453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:25 crc kubenswrapper[4762]: E0217 14:26:25.803232 4762 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:26:25 crc kubenswrapper[4762]: E0217 14:26:25.803535 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs podName:2dd899d8-8882-45e1-952a-e4103384ac4c nodeName:}" failed. No retries permitted until 2026-02-17 14:26:41.803519271 +0000 UTC m=+1282.383519933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-gddhj" (UID: "2dd899d8-8882-45e1-952a-e4103384ac4c") : secret "webhook-server-cert" not found Feb 17 14:26:25 crc kubenswrapper[4762]: I0217 14:26:25.810156 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:28 crc kubenswrapper[4762]: E0217 14:26:28.396319 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6" Feb 17 14:26:28 crc kubenswrapper[4762]: E0217 14:26:28.397165 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mb5ks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-2hv4z_openstack-operators(f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:28 crc kubenswrapper[4762]: E0217 14:26:28.398414 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" podUID="f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb" Feb 17 14:26:28 crc kubenswrapper[4762]: E0217 14:26:28.937623 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" podUID="f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.029994 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.031120 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
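The pull failures that follow all share one error string: rpc error: code = Canceled desc = copying config: context canceled. That is a gRPC status the kubelet receives when the CRI runtime's pull is interrupted mid-copy because the request's context was canceled (for example, the pull exceeded its deadline or was superseded). A hedged sketch of how a canceled context surfaces as codes.Canceled; pullImage here is a stand-in, not the real CRI client call:

    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // pullImage stands in for a CRI PullImage RPC: it fails with a gRPC
    // Canceled status if its context has already been canceled.
    func pullImage(ctx context.Context, image string) error {
        select {
        case <-ctx.Done():
            return status.Error(codes.Canceled, "copying config: context canceled")
        default:
            fmt.Println("pulled", image)
            return nil
        }
    }

    func main() {
        ctx, cancel := context.WithCancel(context.Background())
        cancel() // simulate whatever interrupted the real pull
        err := pullImage(ctx, "quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6")
        fmt.Println(status.Code(err), "->", err)
    }

After each ErrImagePull, the pod worker re-queues the container and subsequent sync attempts report ImagePullBackOff until the back-off window expires, which is the pairing visible for each operator below.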
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdp24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-x847n_openstack-operators(6a22270e-2c9e-48d2-8554-8885a67fa92d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.032470 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" podUID="6a22270e-2c9e-48d2-8554-8885a67fa92d" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.617505 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.617730 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rbj7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-gtjx5_openstack-operators(9c5eb531-17f0-4eae-a0a6-f44f2ca0da97): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.618899 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" podUID="9c5eb531-17f0-4eae-a0a6-f44f2ca0da97" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.969959 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" podUID="6a22270e-2c9e-48d2-8554-8885a67fa92d" Feb 17 14:26:31 crc kubenswrapper[4762]: E0217 14:26:31.976985 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" 
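Each "Unhandled Error" above embeds the failing container's full v1.Container spec as a one-line Go struct dump, which is hard to read. Reconstructed as a struct literal, the shape shared by these operator manager containers is roughly the following (fields copied from the dumps; the image is one representative example, and fields not shown in the dumps are left at their zero values):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/api/resource"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        manager := corev1.Container{
            Name:    "manager",
            Image:   "quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c",
            Command: []string{"/manager"},
            Args: []string{
                "--leader-elect",
                "--health-probe-bind-address=:8081",
                "--metrics-bind-address=127.0.0.1:8080",
            },
            Env: []corev1.EnvVar{
                {Name: "LEASE_DURATION", Value: "30"},
                {Name: "RENEW_DEADLINE", Value: "20"},
                {Name: "RETRY_PERIOD", Value: "5"},
                {Name: "ENABLE_WEBHOOKS", Value: "false"},
                {Name: "METRICS_CERTS", Value: "false"},
            },
            Resources: corev1.ResourceRequirements{
                Limits: corev1.ResourceList{
                    corev1.ResourceCPU:    resource.MustParse("500m"),
                    corev1.ResourceMemory: resource.MustParse("512Mi"), // 536870912 in the dump
                },
                Requests: corev1.ResourceList{
                    corev1.ResourceCPU:    resource.MustParse("10m"),
                    corev1.ResourceMemory: resource.MustParse("256Mi"), // 268435456 in the dump
                },
            },
            LivenessProbe: &corev1.Probe{
                ProbeHandler: corev1.ProbeHandler{
                    HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(8081)},
                },
                InitialDelaySeconds: 15,
                TimeoutSeconds:      1,
                PeriodSeconds:       20,
                SuccessThreshold:    1,
                FailureThreshold:    3,
            },
            ReadinessProbe: &corev1.Probe{
                ProbeHandler: corev1.ProbeHandler{
                    HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt(8081)},
                },
                InitialDelaySeconds: 5,
                TimeoutSeconds:      1,
                PeriodSeconds:       10,
                SuccessThreshold:    1,
                FailureThreshold:    3,
            },
            ImagePullPolicy: corev1.PullIfNotPresent,
        }
        fmt.Printf("%s -> %s\n", manager.Name, manager.Image)
    }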
podUID="9c5eb531-17f0-4eae-a0a6-f44f2ca0da97" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.374465 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.375191 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d5sl5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-bzgvz_openstack-operators(a7230b0a-9b7e-4430-843d-7754ba5dc370): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.376965 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" podUID="a7230b0a-9b7e-4430-843d-7754ba5dc370" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.985027 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" podUID="a7230b0a-9b7e-4430-843d-7754ba5dc370" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.995425 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.995492 4762 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.995722 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdk8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d6964fcdb-5jb4z_openstack-operators(ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:33 crc kubenswrapper[4762]: E0217 14:26:33.997714 4762 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" podUID="ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e" Feb 17 14:26:34 crc kubenswrapper[4762]: E0217 14:26:34.488011 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 17 14:26:34 crc kubenswrapper[4762]: E0217 14:26:34.488222 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5c289,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-jtvhg_openstack-operators(4414da08-4cca-4b53-b590-3511e77060e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:34 crc kubenswrapper[4762]: E0217 14:26:34.489404 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" 
podUID="4414da08-4cca-4b53-b590-3511e77060e0" Feb 17 14:26:34 crc kubenswrapper[4762]: E0217 14:26:34.993275 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" podUID="4414da08-4cca-4b53-b590-3511e77060e0" Feb 17 14:26:34 crc kubenswrapper[4762]: E0217 14:26:34.994833 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" podUID="ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e" Feb 17 14:26:36 crc kubenswrapper[4762]: E0217 14:26:36.372695 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 17 14:26:36 crc kubenswrapper[4762]: E0217 14:26:36.373180 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8dfx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-jkgwj_openstack-operators(afb78ebd-d200-4441-a12f-e1e63dfb71d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:36 crc kubenswrapper[4762]: E0217 14:26:36.374422 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" podUID="afb78ebd-d200-4441-a12f-e1e63dfb71d9" Feb 17 14:26:37 crc kubenswrapper[4762]: E0217 14:26:37.507191 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" podUID="afb78ebd-d200-4441-a12f-e1e63dfb71d9" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.306585 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.306835 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8xwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-74hcc_openstack-operators(0c922b97-d376-45cc-986d-c13735e6c43e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.308216 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" podUID="0c922b97-d376-45cc-986d-c13735e6c43e" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.501495 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" podUID="0c922b97-d376-45cc-986d-c13735e6c43e" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.820743 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.821195 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcgn9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-wwhs6_openstack-operators(0cf7a5f5-8168-4054-8aba-55315da55d18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:39 crc kubenswrapper[4762]: E0217 14:26:39.822400 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" podUID="0cf7a5f5-8168-4054-8aba-55315da55d18" Feb 17 14:26:40 crc kubenswrapper[4762]: I0217 14:26:40.252331 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:40 crc kubenswrapper[4762]: I0217 14:26:40.258532 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ebeafd3-8c4c-4473-b382-7f190a92096a-cert\") pod \"infra-operator-controller-manager-79d975b745-2k62f\" (UID: \"2ebeafd3-8c4c-4473-b382-7f190a92096a\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:40 crc kubenswrapper[4762]: I0217 14:26:40.340583 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nppkg" Feb 17 14:26:40 crc kubenswrapper[4762]: I0217 14:26:40.349558 4762 util.go:30] "No sandbox for pod can be found. 
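The "Caches populated for *v1.Secret" lines mark a client-go reflector completing its initial list/watch for a secret the pod needs (here the service account's dockercfg pull secret), after which pod setup proceeds. A minimal informer sketch of the same populate-then-proceed pattern, assuming a local kubeconfig (only the namespace comes from the log; this is not the kubelet's internal secret manager):

    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        factory := informers.NewSharedInformerFactoryWithOptions(
            cs, 10*time.Minute, informers.WithNamespace("openstack-operators"))
        secrets := factory.Core().V1().Secrets().Informer()

        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)
        // Blocks until the reflector's initial LIST+WATCH has filled the store,
        // the point the log reports as "Caches populated for *v1.Secret".
        if !cache.WaitForCacheSync(stop, secrets.HasSynced) {
            panic("cache never synced")
        }
        fmt.Println("secret cache synced;", len(secrets.GetStore().List()), "secrets cached")
    }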
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:40 crc kubenswrapper[4762]: E0217 14:26:40.509740 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" podUID="0cf7a5f5-8168-4054-8aba-55315da55d18" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.348714 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.349105 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xkr78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-qbgn5_openstack-operators(2d3c8e1f-e388-467a-a744-5c332868bde3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.350258 4762 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" podUID="2d3c8e1f-e388-467a-a744-5c332868bde3" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.516370 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" podUID="2d3c8e1f-e388-467a-a744-5c332868bde3" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.826096 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.826314 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g46tn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-567668f5cf-jh42l_openstack-operators(b570b810-b8a4-4ca0-89d5-3992368a4867): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:41 crc kubenswrapper[4762]: E0217 14:26:41.827582 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" podUID="b570b810-b8a4-4ca0-89d5-3992368a4867" Feb 17 14:26:41 crc kubenswrapper[4762]: I0217 14:26:41.891664 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:41 crc kubenswrapper[4762]: I0217 14:26:41.901628 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2dd899d8-8882-45e1-952a-e4103384ac4c-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-gddhj\" (UID: \"2dd899d8-8882-45e1-952a-e4103384ac4c\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:41 crc kubenswrapper[4762]: I0217 14:26:41.929440 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bxmfb" Feb 17 14:26:41 crc kubenswrapper[4762]: I0217 14:26:41.937154 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:42 crc kubenswrapper[4762]: E0217 14:26:42.523874 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" podUID="b570b810-b8a4-4ca0-89d5-3992368a4867" Feb 17 14:26:42 crc kubenswrapper[4762]: E0217 14:26:42.855049 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 17 14:26:42 crc kubenswrapper[4762]: E0217 14:26:42.855255 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jgj6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6pl9x_openstack-operators(4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:42 crc kubenswrapper[4762]: E0217 14:26:42.856524 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" podUID="4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d" Feb 17 14:26:43 crc kubenswrapper[4762]: 
E0217 14:26:43.372868 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 17 14:26:43 crc kubenswrapper[4762]: E0217 14:26:43.373089 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lvj9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-kt8qn_openstack-operators(0178fd98-dd5b-43f5-b2cd-d118b3803888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:43 crc kubenswrapper[4762]: E0217 14:26:43.374372 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" podUID="0178fd98-dd5b-43f5-b2cd-d118b3803888" Feb 17 14:26:43 crc kubenswrapper[4762]: E0217 14:26:43.538771 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" podUID="0178fd98-dd5b-43f5-b2cd-d118b3803888" Feb 17 14:26:43 crc kubenswrapper[4762]: E0217 14:26:43.539248 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" podUID="4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d" Feb 17 14:26:43 crc kubenswrapper[4762]: I0217 14:26:43.793347 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr"] Feb 17 14:26:43 crc kubenswrapper[4762]: I0217 14:26:43.876261 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj"] Feb 17 14:26:44 crc kubenswrapper[4762]: W0217 14:26:44.022494 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebeafd3_8c4c_4473_b382_7f190a92096a.slice/crio-0c204e28271bc35273d6f977eee5c1cf1c6efc4ebd60b52d4b65ee6d8f1e3d81 WatchSource:0}: Error finding container 0c204e28271bc35273d6f977eee5c1cf1c6efc4ebd60b52d4b65ee6d8f1e3d81: Status 404 returned error can't find the container with id 0c204e28271bc35273d6f977eee5c1cf1c6efc4ebd60b52d4b65ee6d8f1e3d81 Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.029121 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-2k62f"] Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.558598 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" event={"ID":"f2be497a-b70f-49ca-880e-9675bfd83a93","Type":"ContainerStarted","Data":"81ed7a63065fa942ce1cfcdc750a7870c64a97290df0d333020df185082a0a43"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.559978 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.578882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" event={"ID":"f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb","Type":"ContainerStarted","Data":"104c01637aac9823eab7250782b3d21ae516af35e1494ad43757fd25089134a6"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.579111 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.585425 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" event={"ID":"2dd899d8-8882-45e1-952a-e4103384ac4c","Type":"ContainerStarted","Data":"f8a3848495544b82894ed258ead51e99e142f1dce7b6f36bd978ec339ab76fae"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.585493 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" 
event={"ID":"2dd899d8-8882-45e1-952a-e4103384ac4c","Type":"ContainerStarted","Data":"5c5b54631f5752e9d68ba6aedca473def54695c6d11c535f616dec27cc3a9542"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.585582 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.596145 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" event={"ID":"09b86f06-6cae-45aa-8e1e-8de6408dae32","Type":"ContainerStarted","Data":"765be1e68f0077d3e24a8203ac9a744ffdd446f2e8fe7124d670f86fee07158f"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.596288 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.598414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" event={"ID":"6abe751d-7643-4aa7-a843-bbde4ed4a457","Type":"ContainerStarted","Data":"d7c81b6f6cbd3a8d69ce63be3759159d5fc9a90131fb871a3e0706ab46938f3c"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.602699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" event={"ID":"6b0c5012-70b1-42f3-9bf1-734acf6a8f2f","Type":"ContainerStarted","Data":"28f293c5f3933d99394230021ddc09a2a50f45fc0fec49d05695e8b093d95670"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.603085 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.604773 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" event={"ID":"149d4551-5870-46cb-871b-8a0e5dd25508","Type":"ContainerStarted","Data":"952c7aa7004aa38145dc8f633344e7e453362fed1245c80e53045a63e8172901"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.605006 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.609819 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" podStartSLOduration=5.262290128 podStartE2EDuration="36.609798248s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:11.092079509 +0000 UTC m=+1251.672080161" lastFinishedPulling="2026-02-17 14:26:42.439587629 +0000 UTC m=+1283.019588281" observedRunningTime="2026-02-17 14:26:44.602595871 +0000 UTC m=+1285.182596523" watchObservedRunningTime="2026-02-17 14:26:44.609798248 +0000 UTC m=+1285.189798900" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.610487 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" event={"ID":"004074b2-55cb-4596-84e6-b715ec66bd2c","Type":"ContainerStarted","Data":"9861c6f1636e84a373c5997610ead0b42d82f3b42451203c361ee565ce3904dd"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.610776 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.619755 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" event={"ID":"6b5af5f5-ea83-427b-b987-f6215d329670","Type":"ContainerStarted","Data":"a93e1c6110d76d6de69885aae4ed5cf93c895d48f62ae5d11e5364355fe430e8"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.619952 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.624441 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" event={"ID":"2ebeafd3-8c4c-4473-b382-7f190a92096a","Type":"ContainerStarted","Data":"0c204e28271bc35273d6f977eee5c1cf1c6efc4ebd60b52d4b65ee6d8f1e3d81"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.627537 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" event={"ID":"bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1","Type":"ContainerStarted","Data":"4ad8f85f84f6ddb56fd84ce4fbf75031540d1445f1522e2ba3bd6c8d6fbf013a"} Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.627857 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.660085 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" podStartSLOduration=5.653773071 podStartE2EDuration="37.66006481s" podCreationTimestamp="2026-02-17 14:26:07 +0000 UTC" firstStartedPulling="2026-02-17 14:26:10.830745837 +0000 UTC m=+1251.410746499" lastFinishedPulling="2026-02-17 14:26:42.837037586 +0000 UTC m=+1283.417038238" observedRunningTime="2026-02-17 14:26:44.65603374 +0000 UTC m=+1285.236034392" watchObservedRunningTime="2026-02-17 14:26:44.66006481 +0000 UTC m=+1285.240065462" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.711312 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" podStartSLOduration=35.711292668 podStartE2EDuration="35.711292668s" podCreationTimestamp="2026-02-17 14:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:44.704127762 +0000 UTC m=+1285.284128404" watchObservedRunningTime="2026-02-17 14:26:44.711292668 +0000 UTC m=+1285.291293320" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.740087 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" podStartSLOduration=6.190593094 podStartE2EDuration="36.740060343s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.129391912 +0000 UTC m=+1253.709392564" lastFinishedPulling="2026-02-17 14:26:43.678859161 +0000 UTC m=+1284.258859813" observedRunningTime="2026-02-17 14:26:44.730594175 +0000 UTC m=+1285.310594827" watchObservedRunningTime="2026-02-17 14:26:44.740060343 +0000 UTC m=+1285.320060995" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.762305 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" podStartSLOduration=6.145544654 podStartE2EDuration="36.76228782s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:11.820252603 +0000 UTC m=+1252.400253255" lastFinishedPulling="2026-02-17 14:26:42.436995769 +0000 UTC m=+1283.016996421" observedRunningTime="2026-02-17 14:26:44.759105413 +0000 UTC m=+1285.339106065" watchObservedRunningTime="2026-02-17 14:26:44.76228782 +0000 UTC m=+1285.342288472" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.793066 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" podStartSLOduration=5.559810617 podStartE2EDuration="37.793048529s" podCreationTimestamp="2026-02-17 14:26:07 +0000 UTC" firstStartedPulling="2026-02-17 14:26:11.127594009 +0000 UTC m=+1251.707594661" lastFinishedPulling="2026-02-17 14:26:43.360831921 +0000 UTC m=+1283.940832573" observedRunningTime="2026-02-17 14:26:44.786974493 +0000 UTC m=+1285.366975135" watchObservedRunningTime="2026-02-17 14:26:44.793048529 +0000 UTC m=+1285.373049181" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.814145 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" podStartSLOduration=6.698699171 podStartE2EDuration="36.814122424s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.281500433 +0000 UTC m=+1253.861501085" lastFinishedPulling="2026-02-17 14:26:43.396923686 +0000 UTC m=+1283.976924338" observedRunningTime="2026-02-17 14:26:44.810525626 +0000 UTC m=+1285.390526278" watchObservedRunningTime="2026-02-17 14:26:44.814122424 +0000 UTC m=+1285.394123076" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.839217 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" podStartSLOduration=5.8726731149999996 podStartE2EDuration="37.839199549s" podCreationTimestamp="2026-02-17 14:26:07 +0000 UTC" firstStartedPulling="2026-02-17 14:26:10.472184561 +0000 UTC m=+1251.052185213" lastFinishedPulling="2026-02-17 14:26:42.438710995 +0000 UTC m=+1283.018711647" observedRunningTime="2026-02-17 14:26:44.839086246 +0000 UTC m=+1285.419086898" watchObservedRunningTime="2026-02-17 14:26:44.839199549 +0000 UTC m=+1285.419200191" Feb 17 14:26:44 crc kubenswrapper[4762]: I0217 14:26:44.864923 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" podStartSLOduration=5.794938743 podStartE2EDuration="37.86489801s" podCreationTimestamp="2026-02-17 14:26:07 +0000 UTC" firstStartedPulling="2026-02-17 14:26:10.368838291 +0000 UTC m=+1250.948838943" lastFinishedPulling="2026-02-17 14:26:42.438797558 +0000 UTC m=+1283.018798210" observedRunningTime="2026-02-17 14:26:44.858387372 +0000 UTC m=+1285.438388024" watchObservedRunningTime="2026-02-17 14:26:44.86489801 +0000 UTC m=+1285.444898662" Feb 17 14:26:46 crc kubenswrapper[4762]: I0217 14:26:46.645255 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" 
event={"ID":"ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e","Type":"ContainerStarted","Data":"943bbf614ebb9cece7737966afa59453aac4fd4d327b25e5d7a73494a8be543e"} Feb 17 14:26:46 crc kubenswrapper[4762]: I0217 14:26:46.646144 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:26:46 crc kubenswrapper[4762]: I0217 14:26:46.671661 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" podStartSLOduration=5.764588066 podStartE2EDuration="38.671626418s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.277025731 +0000 UTC m=+1253.857026383" lastFinishedPulling="2026-02-17 14:26:46.184064083 +0000 UTC m=+1286.764064735" observedRunningTime="2026-02-17 14:26:46.666116228 +0000 UTC m=+1287.246116900" watchObservedRunningTime="2026-02-17 14:26:46.671626418 +0000 UTC m=+1287.251627060" Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.654139 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" event={"ID":"6a22270e-2c9e-48d2-8554-8885a67fa92d","Type":"ContainerStarted","Data":"0de290a222be9784708d219c7af8e17a253f9f15645c7aae15cec4aeb27fda69"} Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.656921 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" event={"ID":"9c5eb531-17f0-4eae-a0a6-f44f2ca0da97","Type":"ContainerStarted","Data":"d265fb715d049a0be1efbb3baf38a0e0e5c30738fefd353c54ff36592055e70e"} Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.657187 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.657958 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" event={"ID":"a7230b0a-9b7e-4430-843d-7754ba5dc370","Type":"ContainerStarted","Data":"7898d00d2e4121e4efb600f99d918e3e994d902721f0a5754375071f80ea90f4"} Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.658058 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.692050 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" podStartSLOduration=6.338793389 podStartE2EDuration="39.692033617s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.272691763 +0000 UTC m=+1253.852692415" lastFinishedPulling="2026-02-17 14:26:46.625931991 +0000 UTC m=+1287.205932643" observedRunningTime="2026-02-17 14:26:47.688217433 +0000 UTC m=+1288.268218085" watchObservedRunningTime="2026-02-17 14:26:47.692033617 +0000 UTC m=+1288.272034279" Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.692331 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" podStartSLOduration=4.968311625 podStartE2EDuration="39.692327245s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:11.801732327 +0000 UTC 
m=+1252.381732989" lastFinishedPulling="2026-02-17 14:26:46.525747957 +0000 UTC m=+1287.105748609" observedRunningTime="2026-02-17 14:26:47.674357705 +0000 UTC m=+1288.254358367" watchObservedRunningTime="2026-02-17 14:26:47.692327245 +0000 UTC m=+1288.272327897" Feb 17 14:26:47 crc kubenswrapper[4762]: I0217 14:26:47.711071 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" podStartSLOduration=6.139740416 podStartE2EDuration="39.711047396s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:12.957618914 +0000 UTC m=+1253.537619566" lastFinishedPulling="2026-02-17 14:26:46.528925894 +0000 UTC m=+1287.108926546" observedRunningTime="2026-02-17 14:26:47.706598545 +0000 UTC m=+1288.286599217" watchObservedRunningTime="2026-02-17 14:26:47.711047396 +0000 UTC m=+1288.291048058" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.270002 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rnh4n" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.343082 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ftcx6" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.473894 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ww45l" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.494383 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-6mbwp" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.560954 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-4bg4h" Feb 17 14:26:48 crc kubenswrapper[4762]: I0217 14:26:48.717833 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-spgjw" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.657607 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-xg6kw" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.677074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" event={"ID":"6abe751d-7643-4aa7-a843-bbde4ed4a457","Type":"ContainerStarted","Data":"d7106281091a91a722634ddd8de764ef1fcf04a8fde940f6cc548153e1aa2556"} Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.677203 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.679188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" event={"ID":"4414da08-4cca-4b53-b590-3511e77060e0","Type":"ContainerStarted","Data":"f536cee22dee3f81fc00770a8a1c72c28aecde5fc090bce362017595da6aed14"} Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.679534 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.681411 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" event={"ID":"afb78ebd-d200-4441-a12f-e1e63dfb71d9","Type":"ContainerStarted","Data":"568c189c7e5dc7a467724f0d31208a0ada707dd4f337531034f760ed4a5c6daa"} Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.681617 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.691939 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" event={"ID":"2ebeafd3-8c4c-4473-b382-7f190a92096a","Type":"ContainerStarted","Data":"b715fc399d2cdad2695fae07e966f283b2a13dd565636a74e29c611175ff849a"} Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.692877 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.732072 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" podStartSLOduration=36.570966608 podStartE2EDuration="41.732046954s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:43.800048298 +0000 UTC m=+1284.380048950" lastFinishedPulling="2026-02-17 14:26:48.961128644 +0000 UTC m=+1289.541129296" observedRunningTime="2026-02-17 14:26:49.722463072 +0000 UTC m=+1290.302463724" watchObservedRunningTime="2026-02-17 14:26:49.732046954 +0000 UTC m=+1290.312047606" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.781918 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" podStartSLOduration=5.71520902 podStartE2EDuration="41.781901204s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:12.896218988 +0000 UTC m=+1253.476219640" lastFinishedPulling="2026-02-17 14:26:48.962911172 +0000 UTC m=+1289.542911824" observedRunningTime="2026-02-17 14:26:49.763152523 +0000 UTC m=+1290.343153185" watchObservedRunningTime="2026-02-17 14:26:49.781901204 +0000 UTC m=+1290.361901856" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.783154 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" podStartSLOduration=5.243819375 podStartE2EDuration="41.783145728s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:12.964950624 +0000 UTC m=+1253.544951276" lastFinishedPulling="2026-02-17 14:26:49.504276977 +0000 UTC m=+1290.084277629" observedRunningTime="2026-02-17 14:26:49.780334151 +0000 UTC m=+1290.360334803" watchObservedRunningTime="2026-02-17 14:26:49.783145728 +0000 UTC m=+1290.363146380" Feb 17 14:26:49 crc kubenswrapper[4762]: I0217 14:26:49.808768 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" podStartSLOduration=36.873977087 podStartE2EDuration="41.808750616s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 
14:26:44.031414733 +0000 UTC m=+1284.611415385" lastFinishedPulling="2026-02-17 14:26:48.966188262 +0000 UTC m=+1289.546188914" observedRunningTime="2026-02-17 14:26:49.798237009 +0000 UTC m=+1290.378237681" watchObservedRunningTime="2026-02-17 14:26:49.808750616 +0000 UTC m=+1290.388751268" Feb 17 14:26:50 crc kubenswrapper[4762]: I0217 14:26:50.048819 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-2hv4z" Feb 17 14:26:50 crc kubenswrapper[4762]: I0217 14:26:50.096610 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:26:51 crc kubenswrapper[4762]: I0217 14:26:51.710619 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" event={"ID":"0c922b97-d376-45cc-986d-c13735e6c43e","Type":"ContainerStarted","Data":"75a63d9f69e48313bedee19762ae9c04a9ee95478032bcb1004c992b0fcf2dcd"} Feb 17 14:26:51 crc kubenswrapper[4762]: I0217 14:26:51.711167 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:51 crc kubenswrapper[4762]: I0217 14:26:51.732227 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" podStartSLOduration=5.538116677 podStartE2EDuration="43.732208101s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.266749781 +0000 UTC m=+1253.846750433" lastFinishedPulling="2026-02-17 14:26:51.460841205 +0000 UTC m=+1292.040841857" observedRunningTime="2026-02-17 14:26:51.726768433 +0000 UTC m=+1292.306769085" watchObservedRunningTime="2026-02-17 14:26:51.732208101 +0000 UTC m=+1292.312208753" Feb 17 14:26:51 crc kubenswrapper[4762]: I0217 14:26:51.943866 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-gddhj" Feb 17 14:26:52 crc kubenswrapper[4762]: I0217 14:26:52.723417 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" event={"ID":"0cf7a5f5-8168-4054-8aba-55315da55d18","Type":"ContainerStarted","Data":"061e360f58e5e0a3109f3f64d6e4a55f2e2e40ee7ffac85158e629bd24aab457"} Feb 17 14:26:52 crc kubenswrapper[4762]: I0217 14:26:52.724369 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:52 crc kubenswrapper[4762]: I0217 14:26:52.744276 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" podStartSLOduration=5.153742646 podStartE2EDuration="44.744253652s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:12.895962741 +0000 UTC m=+1253.475963393" lastFinishedPulling="2026-02-17 14:26:52.486473747 +0000 UTC m=+1293.066474399" observedRunningTime="2026-02-17 14:26:52.737885468 +0000 UTC m=+1293.317886120" watchObservedRunningTime="2026-02-17 14:26:52.744253652 +0000 UTC m=+1293.324254304" Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.621145 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.621955 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.741857 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" event={"ID":"b570b810-b8a4-4ca0-89d5-3992368a4867","Type":"ContainerStarted","Data":"73fbe742e33a7e98fd8ba05d944f0bbb2a31fff92d53d1db5a2dfac84a30d3f4"} Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.742816 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.743757 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" event={"ID":"2d3c8e1f-e388-467a-a744-5c332868bde3","Type":"ContainerStarted","Data":"626507463cb255ad71728021997ebef4401b2fa57e30c47074e11d2f26874bab"} Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.744065 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.766442 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" podStartSLOduration=5.2904296760000005 podStartE2EDuration="46.76642074s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.049280435 +0000 UTC m=+1253.629281087" lastFinishedPulling="2026-02-17 14:26:54.525271489 +0000 UTC m=+1295.105272151" observedRunningTime="2026-02-17 14:26:54.75872664 +0000 UTC m=+1295.338727292" watchObservedRunningTime="2026-02-17 14:26:54.76642074 +0000 UTC m=+1295.346421392" Feb 17 14:26:54 crc kubenswrapper[4762]: I0217 14:26:54.779707 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" podStartSLOduration=5.597376285 podStartE2EDuration="46.779638221s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.270638167 +0000 UTC m=+1253.850638819" lastFinishedPulling="2026-02-17 14:26:54.452900093 +0000 UTC m=+1295.032900755" observedRunningTime="2026-02-17 14:26:54.77851546 +0000 UTC m=+1295.358516112" watchObservedRunningTime="2026-02-17 14:26:54.779638221 +0000 UTC m=+1295.359638873" Feb 17 14:26:55 crc kubenswrapper[4762]: I0217 14:26:55.430157 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr" Feb 17 14:26:55 crc kubenswrapper[4762]: I0217 14:26:55.752915 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" 
event={"ID":"4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d","Type":"ContainerStarted","Data":"bdcba27af8fa023c8d3c8c441693ef43851e7224fc56a221a6c5981d40512f43"} Feb 17 14:26:55 crc kubenswrapper[4762]: I0217 14:26:55.771429 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6pl9x" podStartSLOduration=4.529656006 podStartE2EDuration="46.771404628s" podCreationTimestamp="2026-02-17 14:26:09 +0000 UTC" firstStartedPulling="2026-02-17 14:26:13.270683198 +0000 UTC m=+1253.850683860" lastFinishedPulling="2026-02-17 14:26:55.51243184 +0000 UTC m=+1296.092432482" observedRunningTime="2026-02-17 14:26:55.767545213 +0000 UTC m=+1296.347545885" watchObservedRunningTime="2026-02-17 14:26:55.771404628 +0000 UTC m=+1296.351405280" Feb 17 14:26:56 crc kubenswrapper[4762]: I0217 14:26:56.760973 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" event={"ID":"0178fd98-dd5b-43f5-b2cd-d118b3803888","Type":"ContainerStarted","Data":"3fc773fd434ca0d50bcd7d384a6c613d04263c42efbb64f6fabd6b56d963d9b6"} Feb 17 14:26:56 crc kubenswrapper[4762]: I0217 14:26:56.761477 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:26:56 crc kubenswrapper[4762]: I0217 14:26:56.782256 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" podStartSLOduration=3.895827513 podStartE2EDuration="48.782238374s" podCreationTimestamp="2026-02-17 14:26:08 +0000 UTC" firstStartedPulling="2026-02-17 14:26:11.611169996 +0000 UTC m=+1252.191170648" lastFinishedPulling="2026-02-17 14:26:56.497580857 +0000 UTC m=+1297.077581509" observedRunningTime="2026-02-17 14:26:56.776031366 +0000 UTC m=+1297.356032028" watchObservedRunningTime="2026-02-17 14:26:56.782238374 +0000 UTC m=+1297.362239026" Feb 17 14:26:58 crc kubenswrapper[4762]: I0217 14:26:58.619676 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-x847n" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.519411 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-gtjx5" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.529456 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wwhs6" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.604165 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-74hcc" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.626876 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-jh42l" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.822927 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-qbgn5" Feb 17 14:26:59 crc kubenswrapper[4762]: I0217 14:26:59.990560 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jtvhg" Feb 17 14:27:00 crc 
kubenswrapper[4762]: I0217 14:27:00.020455 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jkgwj" Feb 17 14:27:00 crc kubenswrapper[4762]: I0217 14:27:00.093450 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" Feb 17 14:27:00 crc kubenswrapper[4762]: I0217 14:27:00.113875 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bzgvz" Feb 17 14:27:00 crc kubenswrapper[4762]: I0217 14:27:00.361033 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-2k62f" Feb 17 14:27:08 crc kubenswrapper[4762]: I0217 14:27:08.882590 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-kt8qn" Feb 17 14:27:24 crc kubenswrapper[4762]: I0217 14:27:24.621035 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:27:24 crc kubenswrapper[4762]: I0217 14:27:24.622771 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.094175 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.096043 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.096133 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.102135 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cr2cb" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.102175 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.102332 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.102411 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.147739 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.147878 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5hl\" (UniqueName: \"kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.168816 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.170573 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.183946 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.185523 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.249983 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.250083 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.250110 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.250217 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5hl\" (UniqueName: \"kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.250294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qp9h\" (UniqueName: \"kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.251356 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.287434 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5hl\" (UniqueName: \"kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl\") pod \"dnsmasq-dns-675f4bcbfc-ggqhx\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.372813 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 
14:27:30.372935 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qp9h\" (UniqueName: \"kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.372991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.373855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.374510 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.397227 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qp9h\" (UniqueName: \"kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h\") pod \"dnsmasq-dns-78dd6ddcc-qq4lx\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.433367 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.501622 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:27:30 crc kubenswrapper[4762]: I0217 14:27:30.926162 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:27:31 crc kubenswrapper[4762]: I0217 14:27:31.099512 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:27:31 crc kubenswrapper[4762]: W0217 14:27:31.102405 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1560f7fc_7396_480e_9b67_e62ccdf2b299.slice/crio-959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945 WatchSource:0}: Error finding container 959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945: Status 404 returned error can't find the container with id 959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945 Feb 17 14:27:31 crc kubenswrapper[4762]: I0217 14:27:31.232069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" event={"ID":"38b00521-3bad-4a3b-b706-efd326d22495","Type":"ContainerStarted","Data":"0ae2fe04e7b1fa76872016492eb6147f3473124d94b2643fe5832d9db01f10e5"} Feb 17 14:27:31 crc kubenswrapper[4762]: I0217 14:27:31.233195 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" event={"ID":"1560f7fc-7396-480e-9b67-e62ccdf2b299","Type":"ContainerStarted","Data":"959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945"} Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.630225 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.655477 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"] Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.657124 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.676044 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"] Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.711612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5mk\" (UniqueName: \"kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.711716 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.711819 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.814416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5mk\" (UniqueName: \"kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.814475 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.814577 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.815873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.816922 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:32 crc kubenswrapper[4762]: I0217 14:27:32.843549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx5mk\" (UniqueName: 
\"kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk\") pod \"dnsmasq-dns-666b6646f7-sml78\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") " pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.000276 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sml78" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.008749 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.057950 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.062958 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.071418 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.248637 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.248801 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.248854 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgmh\" (UniqueName: \"kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.353335 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.351631 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.354934 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgmh\" (UniqueName: \"kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.355129 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.356896 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.384600 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgmh\" (UniqueName: \"kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh\") pod \"dnsmasq-dns-57d769cc4f-7q75w\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.497254 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:27:33 crc kubenswrapper[4762]: W0217 14:27:33.683322 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64dd25ca_1eee_49de_9efd_611c90acb3e2.slice/crio-20f0dc9c3e1911be779bf4b8004e0dcf1f9a0a6b58b0537b101abf6cfede345e WatchSource:0}: Error finding container 20f0dc9c3e1911be779bf4b8004e0dcf1f9a0a6b58b0537b101abf6cfede345e: Status 404 returned error can't find the container with id 20f0dc9c3e1911be779bf4b8004e0dcf1f9a0a6b58b0537b101abf6cfede345e Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.684569 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.815690 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.817520 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822117 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822153 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822327 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822580 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822619 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mzmrt" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822772 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.822793 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.856977 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.866127 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.867851 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.880556 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.882374 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.895164 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.916873 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.967970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968187 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968374 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968448 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968600 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12862d08-7816-4a6d-9a52-aceeae5e1d8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968669 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: 
I0217 14:27:33.968759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968818 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12862d08-7816-4a6d-9a52-aceeae5e1d8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:33 crc kubenswrapper[4762]: I0217 14:27:33.968864 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xk6\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-kube-api-access-42xk6\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.022114 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"] Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074573 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074689 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074854 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d23bccd7-14f7-419d-95db-38470afb02b0-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074904 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/391886d8-341f-4e66-980c-00f6cd881e10-pod-info\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 
17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074935 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.074958 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-server-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075009 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075087 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-config-data\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8m5k\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-kube-api-access-m8m5k\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075195 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12862d08-7816-4a6d-9a52-aceeae5e1d8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075270 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075297 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075361 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075393 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075454 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdx5\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-kube-api-access-vzdx5\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075565 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075731 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075827 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075883 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 
14:27:34.075932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/391886d8-341f-4e66-980c-00f6cd881e10-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075958 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.075994 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12862d08-7816-4a6d-9a52-aceeae5e1d8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076027 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42xk6\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-kube-api-access-42xk6\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076205 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-server-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076520 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d23bccd7-14f7-419d-95db-38470afb02b0-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076547 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-config-data\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076632 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.076728 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.079459 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.079931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12862d08-7816-4a6d-9a52-aceeae5e1d8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.082901 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.083013 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.083060 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/62c00698853ea6f61441e372a4bbdfc890518599aeb806ef87dabf834350909a/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.083474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/12862d08-7816-4a6d-9a52-aceeae5e1d8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.084100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.086448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/12862d08-7816-4a6d-9a52-aceeae5e1d8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.094810 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xk6\" (UniqueName: \"kubernetes.io/projected/12862d08-7816-4a6d-9a52-aceeae5e1d8e-kube-api-access-42xk6\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.158962 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9c73dcb0-7502-4682-9a44-bf60f7614057\") pod \"rabbitmq-server-0\" (UID: \"12862d08-7816-4a6d-9a52-aceeae5e1d8e\") " pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179577 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179655 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d23bccd7-14f7-419d-95db-38470afb02b0-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179677 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/391886d8-341f-4e66-980c-00f6cd881e10-pod-info\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc 
kubenswrapper[4762]: I0217 14:27:34.179696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-server-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179713 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179734 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-config-data\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179757 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8m5k\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-kube-api-access-m8m5k\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179861 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdx5\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-kube-api-access-vzdx5\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179925 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179940 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179959 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179973 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.179988 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/391886d8-341f-4e66-980c-00f6cd881e10-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.180021 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-server-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.180044 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.180066 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d23bccd7-14f7-419d-95db-38470afb02b0-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.180084 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-config-data\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " 
pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.181685 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-config-data\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.182401 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.182933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.183566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-config-data\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.183833 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.183940 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d23bccd7-14f7-419d-95db-38470afb02b0-server-conf\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.184491 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.185317 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.185348 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de41b6c63e22e634ade08d0f8c10253ca19a73e4782a72d012eb58384263ee61/globalmount\"" pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.187358 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.188603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.189403 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.199837 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/391886d8-341f-4e66-980c-00f6cd881e10-server-conf\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.201401 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.210505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.210952 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
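The "device mount path" printed for each PVC follows the layout /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<64-hex-digest>/globalmount. The digest appears to be a SHA-256 over the CSI volume handle; treating that derivation as an assumption rather than a guarantee, the path can be reconstructed like this:

    package main

    import (
        "crypto/sha256"
        "fmt"
        "path/filepath"
    )

    // deviceMountPath rebuilds the globalmount path format seen in the
    // MountDevice messages above. The SHA-256-of-volume-handle step is
    // an assumption made for illustration.
    func deviceMountPath(kubeletRoot, driver, volumeHandle string) string {
        sum := sha256.Sum256([]byte(volumeHandle))
        return filepath.Join(kubeletRoot, "plugins", "kubernetes.io", "csi",
            driver, fmt.Sprintf("%x", sum), "globalmount")
    }

    func main() {
        fmt.Println(deviceMountPath("/var/lib/kubelet",
            "kubevirt.io.hostpath-provisioner",
            "pvc-856169a0-c43d-40f4-97cf-3cc3517645e1"))
    }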
Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.210979 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cbe75989f4d03561427e8df322c3cd1a073f58be1b39841d561c58c528d7dd9d/globalmount\"" pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.213433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/391886d8-341f-4e66-980c-00f6cd881e10-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.216857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d23bccd7-14f7-419d-95db-38470afb02b0-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.217002 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.217179 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/391886d8-341f-4e66-980c-00f6cd881e10-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.229919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8m5k\" (UniqueName: \"kubernetes.io/projected/d23bccd7-14f7-419d-95db-38470afb02b0-kube-api-access-m8m5k\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.230372 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d23bccd7-14f7-419d-95db-38470afb02b0-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.246237 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdx5\" (UniqueName: \"kubernetes.io/projected/391886d8-341f-4e66-980c-00f6cd881e10-kube-api-access-vzdx5\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.253528 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/391886d8-341f-4e66-980c-00f6cd881e10-pod-info\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.262478 4762 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.264549 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.267002 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.267216 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.268521 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.268735 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.268884 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.269360 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.269518 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-49sjl" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.316716 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.353495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sml78" event={"ID":"64dd25ca-1eee-49de-9efd-611c90acb3e2","Type":"ContainerStarted","Data":"20f0dc9c3e1911be779bf4b8004e0dcf1f9a0a6b58b0537b101abf6cfede345e"} Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.356515 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" event={"ID":"de8fe6a0-5c88-434f-a653-ee334a757900","Type":"ContainerStarted","Data":"54fa9b45b56eced700a20d20f473dcfe758357fa3c8788ebd5c466d59cad9d20"} Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.372893 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856169a0-c43d-40f4-97cf-3cc3517645e1\") pod \"rabbitmq-server-2\" (UID: \"391886d8-341f-4e66-980c-00f6cd881e10\") " pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.380450 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee554542-d79b-4c5a-be3a-d6dd2ac4bfb7\") pod \"rabbitmq-server-1\" (UID: \"d23bccd7-14f7-419d-95db-38470afb02b0\") " pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386453 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386540 
4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386676 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386719 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386767 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386831 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stbb\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-kube-api-access-8stbb\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386857 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386881 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.386940 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" 
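Each UniqueName above encodes which volume plugin owns the mount: the prefix (kubernetes.io/configmap, .../secret, .../empty-dir, .../projected, .../downward-api, .../csi) selects the plugin, and the remainder is either <pod-UID>-<volume-name> or, for CSI, <driver>^<volume-handle>. A small decoder for the format exactly as it appears in these messages; the format is an observation from this log, not a stable API:

    package main

    import (
        "fmt"
        "strings"
    )

    // decode splits a kubelet volume UniqueName into the owning plugin
    // and its payload, matching the two shapes visible in this journal.
    func decode(u string) string {
        parts := strings.SplitN(u, "/", 3)
        if len(parts) != 3 {
            return "unrecognized: " + u
        }
        plugin, rest := parts[0]+"/"+parts[1], parts[2]
        if plugin == "kubernetes.io/csi" {
            if driver, handle, ok := strings.Cut(rest, "^"); ok {
                return fmt.Sprintf("CSI driver %s, volume handle %s", driver, handle)
            }
        }
        return fmt.Sprintf("plugin %s, volume %s", plugin, rest)
    }

    func main() {
        fmt.Println(decode("kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-erlang-cookie"))
        fmt.Println(decode("kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3"))
    }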
Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.387005 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489055 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489345 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489377 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489505 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489545 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stbb\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-kube-api-access-8stbb\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489565 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 
14:27:34.489591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489633 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.489703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.491915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.492846 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.493448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.494546 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
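Volumes under kubernetes.io/projected, such as rabbitmq-confd and the kube-api-access-* volumes mounted above, merge several sources (service-account token, CA bundle, ConfigMaps, downward API) into a single directory. A hedged sketch of such a source built with the k8s.io/api types; the paths and expiry below are illustrative defaults, not values read from this cluster:

    package main

    import (
        corev1 "k8s.io/api/core/v1"
    )

    // kubeAPIAccessVolume sketches the projected volume kubelet mounts
    // for API access: a service-account token plus the cluster CA.
    func kubeAPIAccessVolume(name string) corev1.Volume {
        expiry := int64(3607) // illustrative token lifetime in seconds
        return corev1.Volume{
            Name: name, // e.g. "kube-api-access-8stbb"
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path:              "token",
                            ExpirationSeconds: &expiry,
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                    },
                },
            },
        }
    }

    func main() {
        _ = kubeAPIAccessVolume("kube-api-access-8stbb")
    }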
Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.494575 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0920eeeee5df34c13575d50fbf2941384d425d82744a04109ea4ccf56c290e8d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.495578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.496347 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.496430 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.497129 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.498595 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.501284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.509991 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stbb\" (UniqueName: \"kubernetes.io/projected/6c34ffbd-b33d-4579-8a4d-a51ef852b1a1-kube-api-access-8stbb\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.514604 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.540910 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.563576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3cc6ffd-7a39-4e2e-96dd-d89e7c9bacf3\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.623220 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:27:34 crc kubenswrapper[4762]: I0217 14:27:34.844053 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.258563 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.260552 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.262311 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.266594 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-75h6p" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.266950 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.267517 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.303318 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.307065 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.344731 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.397561 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d23bccd7-14f7-419d-95db-38470afb02b0","Type":"ContainerStarted","Data":"9b0faaf129379aa84805d9fdceb21f1ebe49a06d78c7f72e513ae65ce4873ef5"} Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.398651 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.404296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"12862d08-7816-4a6d-9a52-aceeae5e1d8e","Type":"ContainerStarted","Data":"a2a7161beeb6c0b4a7a283c15288d7a97ca6980f14306f19ec4a45a09ab90ea2"} Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.413833 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.413910 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.413938 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.413986 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.414033 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-kolla-config\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.414062 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.414117 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.414148 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk96w\" (UniqueName: \"kubernetes.io/projected/3fe6d960-8cae-47d2-86e7-c077f0facaae-kube-api-access-kk96w\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.501043 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 14:27:35 crc kubenswrapper[4762]: W0217 14:27:35.513507 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c34ffbd_b33d_4579_8a4d_a51ef852b1a1.slice/crio-817ce2fee11d07146e51cfb31ccf9178c651c5e9802ffa72cb02e2993b5f8957 WatchSource:0}: Error finding container 817ce2fee11d07146e51cfb31ccf9178c651c5e9802ffa72cb02e2993b5f8957: Status 404 returned error can't find the container with id 817ce2fee11d07146e51cfb31ccf9178c651c5e9802ffa72cb02e2993b5f8957 Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-kolla-config\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516583 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516616 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk96w\" (UniqueName: \"kubernetes.io/projected/3fe6d960-8cae-47d2-86e7-c077f0facaae-kube-api-access-kk96w\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516655 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516692 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516714 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.516751 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.518030 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.518492 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.519329 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fe6d960-8cae-47d2-86e7-c077f0facaae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.520746 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fe6d960-8cae-47d2-86e7-c077f0facaae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.528596 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.540348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe6d960-8cae-47d2-86e7-c077f0facaae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.542390 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.542437 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/54f034933d0ba0d4faf7294bc0ad0c4d60e6325b42edf5d522b4e90a12f6184b/globalmount\"" pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.561950 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk96w\" (UniqueName: \"kubernetes.io/projected/3fe6d960-8cae-47d2-86e7-c077f0facaae-kube-api-access-kk96w\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.627101 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83b88529-a7ca-47a6-bba2-aa8c4c5d93e7\") pod \"openstack-galera-0\" (UID: \"3fe6d960-8cae-47d2-86e7-c077f0facaae\") " pod="openstack/openstack-galera-0" Feb 17 14:27:35 crc kubenswrapper[4762]: I0217 14:27:35.910157 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.418424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"391886d8-341f-4e66-980c-00f6cd881e10","Type":"ContainerStarted","Data":"2ad6acab3c85286201e0388bae6bb3ea6767a4dfbfef4aeb0d5c09681e0f7cea"} Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.422042 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1","Type":"ContainerStarted","Data":"817ce2fee11d07146e51cfb31ccf9178c651c5e9802ffa72cb02e2993b5f8957"} Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.574473 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.577013 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.579783 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.580104 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gfqrz" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.580320 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.581128 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.585243 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652150 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652256 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652298 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652384 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 
crc kubenswrapper[4762]: I0217 14:27:36.652555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652614 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652660 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcv8g\" (UniqueName: \"kubernetes.io/projected/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kube-api-access-qcv8g\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.652715 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756680 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756718 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcv8g\" (UniqueName: \"kubernetes.io/projected/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kube-api-access-qcv8g\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756780 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 
14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756815 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756926 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.756954 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.757490 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.762857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.766026 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.772556 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.774161 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.781703 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcv8g\" (UniqueName: \"kubernetes.io/projected/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-kube-api-access-qcv8g\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:36 crc kubenswrapper[4762]: I0217 14:27:36.805274 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd5850c-1106-4dd4-a7d7-b13e08eff2f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.077934 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.082048 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.086034 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.092182 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-swmx8" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.097022 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.124115 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.251717 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.251993 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ff998db00a3267977d1a188f3075bfb93db34edaff8993d7a9a026447e21ed1/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.326095 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.326293 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6ff\" (UniqueName: \"kubernetes.io/projected/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kube-api-access-gv6ff\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.326391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-config-data\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.326531 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") 
" pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.326573 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kolla-config\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.332616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63462bc4-44bd-4dc1-8795-297b843fe9db\") pod \"openstack-cell1-galera-0\" (UID: \"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.428904 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.429043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6ff\" (UniqueName: \"kubernetes.io/projected/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kube-api-access-gv6ff\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.429111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-config-data\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.429181 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.429212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kolla-config\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.431085 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kolla-config\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.432758 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-config-data\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.438071 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-combined-ca-bundle\") pod 
\"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.439240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.448895 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6ff\" (UniqueName: \"kubernetes.io/projected/b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c-kube-api-access-gv6ff\") pod \"memcached-0\" (UID: \"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c\") " pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.478956 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.518869 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 14:27:37 crc kubenswrapper[4762]: I0217 14:27:37.860667 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:27:38 crc kubenswrapper[4762]: I0217 14:27:38.234626 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 14:27:38 crc kubenswrapper[4762]: I0217 14:27:38.551484 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fe6d960-8cae-47d2-86e7-c077f0facaae","Type":"ContainerStarted","Data":"c643a2a514a66d21687e57194b7c97c5a2a1811ec8058831381086bfdd462cf7"} Feb 17 14:27:38 crc kubenswrapper[4762]: I0217 14:27:38.616913 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:27:39 crc kubenswrapper[4762]: I0217 14:27:39.633418 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:27:39 crc kubenswrapper[4762]: I0217 14:27:39.635151 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:27:39 crc kubenswrapper[4762]: I0217 14:27:39.641697 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4z8jj" Feb 17 14:27:39 crc kubenswrapper[4762]: I0217 14:27:39.654416 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:27:40 crc kubenswrapper[4762]: I0217 14:27:40.057992 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9967c\" (UniqueName: \"kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c\") pod \"kube-state-metrics-0\" (UID: \"6d19ed64-87e9-4afd-9c02-4319baed9bda\") " pod="openstack/kube-state-metrics-0" Feb 17 14:27:40 crc kubenswrapper[4762]: I0217 14:27:40.164346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9967c\" (UniqueName: \"kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c\") pod \"kube-state-metrics-0\" (UID: \"6d19ed64-87e9-4afd-9c02-4319baed9bda\") " pod="openstack/kube-state-metrics-0" Feb 17 14:27:40 crc kubenswrapper[4762]: I0217 14:27:40.202098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9967c\" (UniqueName: \"kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c\") pod \"kube-state-metrics-0\" (UID: \"6d19ed64-87e9-4afd-9c02-4319baed9bda\") " pod="openstack/kube-state-metrics-0" Feb 17 14:27:40 crc kubenswrapper[4762]: I0217 14:27:40.286116 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.134319 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-5jb4z" podUID="ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.382900 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.389398 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.395721 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.395950 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.396088 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.396321 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xlgsb" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.396438 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.396563 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.396609 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.398695 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.402928 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.557941 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.557994 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.558198 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.558362 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.558709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.558851 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.558924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9mp\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.559089 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.559126 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.559152 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.661635 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.661740 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.661782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.661942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.661988 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.662065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.662134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.662228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.662315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.662382 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9mp\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.664248 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.677511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.683074 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.839011 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.839482 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.840389 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.846239 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.846294 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a14786d82eecf667a32c06b804e4be54e2c76b1ecf1137b60c795c6a56a8bc4a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.847663 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.870633 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.882457 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9mp\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:41 crc kubenswrapper[4762]: I0217 14:27:41.942753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.043058 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.468808 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xspft"] Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.470617 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.475087 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lvljn" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.475317 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.475448 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.484600 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7gshj"] Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.487502 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.494569 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft"] Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.542065 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7gshj"] Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.664228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-log-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.664291 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/549db29e-a842-49dc-8b6b-1fe3f83857da-scripts\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.664323 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-etc-ovs\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667189 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-ovn-controller-tls-certs\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667277 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667299 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-log\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667417 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-lib\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667446 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0611dcb7-08c7-4999-8bc2-210224f89e66-scripts\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667464 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgcl\" (UniqueName: \"kubernetes.io/projected/549db29e-a842-49dc-8b6b-1fe3f83857da-kube-api-access-rbgcl\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667528 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvl64\" (UniqueName: \"kubernetes.io/projected/0611dcb7-08c7-4999-8bc2-210224f89e66-kube-api-access-pvl64\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667567 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-combined-ca-bundle\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.667584 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-run\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.770422 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-lib\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.770531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0611dcb7-08c7-4999-8bc2-210224f89e66-scripts\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.770566 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgcl\" (UniqueName: \"kubernetes.io/projected/549db29e-a842-49dc-8b6b-1fe3f83857da-kube-api-access-rbgcl\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.770878 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvl64\" (UniqueName: \"kubernetes.io/projected/0611dcb7-08c7-4999-8bc2-210224f89e66-kube-api-access-pvl64\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.770971 4762 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-combined-ca-bundle\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.771020 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-run\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.771116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-log-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.771240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-lib\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.771145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/549db29e-a842-49dc-8b6b-1fe3f83857da-scripts\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.773409 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-etc-ovs\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.773502 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-ovn-controller-tls-certs\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.773607 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.773679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-log\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.773770 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" 
Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.772266 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-log-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.772117 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-run\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.775202 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-var-log\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.775199 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.775280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/549db29e-a842-49dc-8b6b-1fe3f83857da-etc-ovs\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.775662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0611dcb7-08c7-4999-8bc2-210224f89e66-var-run-ovn\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.777099 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0611dcb7-08c7-4999-8bc2-210224f89e66-scripts\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.777936 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/549db29e-a842-49dc-8b6b-1fe3f83857da-scripts\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.782376 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-ovn-controller-tls-certs\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.786298 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0611dcb7-08c7-4999-8bc2-210224f89e66-combined-ca-bundle\") pod \"ovn-controller-xspft\" (UID: 
\"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.792157 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvl64\" (UniqueName: \"kubernetes.io/projected/0611dcb7-08c7-4999-8bc2-210224f89e66-kube-api-access-pvl64\") pod \"ovn-controller-xspft\" (UID: \"0611dcb7-08c7-4999-8bc2-210224f89e66\") " pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.798000 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbgcl\" (UniqueName: \"kubernetes.io/projected/549db29e-a842-49dc-8b6b-1fe3f83857da-kube-api-access-rbgcl\") pod \"ovn-controller-ovs-7gshj\" (UID: \"549db29e-a842-49dc-8b6b-1fe3f83857da\") " pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.858266 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.885761 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.928449 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.930104 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.932198 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.934150 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.934226 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.934336 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-f8bmx" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.934377 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 14:27:42 crc kubenswrapper[4762]: I0217 14:27:42.943517 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86cgc\" (UniqueName: \"kubernetes.io/projected/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-kube-api-access-86cgc\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081251 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081338 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081565 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081884 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.081986 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.186794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86cgc\" (UniqueName: \"kubernetes.io/projected/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-kube-api-access-86cgc\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.186884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.186947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.187012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.187084 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.187222 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.187272 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.187315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.192360 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.193824 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.197840 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.200171 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.200483 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e614f225fb571067c13c0157a1d95e4a6cdc0b6414c192bcab4f8b684f66dce3/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.208575 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.211770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.214268 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.217565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86cgc\" (UniqueName: \"kubernetes.io/projected/de4ebcd7-ede5-4a4a-aed5-55d31eee13bf-kube-api-access-86cgc\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.576766 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-656mp"] Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.579001 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.590933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b4d5a236-abfa-4b4a-a7e2-4ac9b5ba60bd\") pod \"ovsdbserver-nb-0\" (UID: \"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf\") " pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.600818 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-vwd7l" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.601005 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.625802 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-656mp"] Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.711543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q9x4\" (UniqueName: \"kubernetes.io/projected/0e153059-08c6-4155-af14-f724a156b6fd-kube-api-access-7q9x4\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.711610 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.813223 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q9x4\" (UniqueName: \"kubernetes.io/projected/0e153059-08c6-4155-af14-f724a156b6fd-kube-api-access-7q9x4\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.813298 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:43 crc kubenswrapper[4762]: E0217 14:27:43.813523 4762 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Feb 17 14:27:43 crc kubenswrapper[4762]: E0217 14:27:43.813580 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert podName:0e153059-08c6-4155-af14-f724a156b6fd nodeName:}" failed. No retries permitted until 2026-02-17 14:27:44.313557219 +0000 UTC m=+1344.893557871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert") pod "observability-ui-dashboards-66cbf594b5-656mp" (UID: "0e153059-08c6-4155-af14-f724a156b6fd") : secret "observability-ui-dashboards" not found Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.865297 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 14:27:43 crc kubenswrapper[4762]: I0217 14:27:43.881154 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q9x4\" (UniqueName: \"kubernetes.io/projected/0e153059-08c6-4155-af14-f724a156b6fd-kube-api-access-7q9x4\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.031599 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d5bd55fbc-55znb"] Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.048774 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.059894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d5bd55fbc-55znb"] Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkg6r\" (UniqueName: \"kubernetes.io/projected/0ac8722c-6cf3-4581-8107-ae03a6198beb-kube-api-access-nkg6r\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222552 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222590 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-service-ca\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222692 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-trusted-ca-bundle\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222836 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-oauth-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.222879 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-oauth-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-oauth-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326612 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-oauth-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326722 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkg6r\" (UniqueName: \"kubernetes.io/projected/0ac8722c-6cf3-4581-8107-ae03a6198beb-kube-api-access-nkg6r\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.326995 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-service-ca\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.327071 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-trusted-ca-bundle\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 
crc kubenswrapper[4762]: I0217 14:27:44.327109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.331518 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-oauth-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.331565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-service-ca\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.331959 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.332768 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac8722c-6cf3-4581-8107-ae03a6198beb-trusted-ca-bundle\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.332966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-serving-cert\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.333922 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac8722c-6cf3-4581-8107-ae03a6198beb-console-oauth-config\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.342225 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e153059-08c6-4155-af14-f724a156b6fd-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-656mp\" (UID: \"0e153059-08c6-4155-af14-f724a156b6fd\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.359322 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkg6r\" (UniqueName: \"kubernetes.io/projected/0ac8722c-6cf3-4581-8107-ae03a6198beb-kube-api-access-nkg6r\") pod \"console-d5bd55fbc-55znb\" (UID: \"0ac8722c-6cf3-4581-8107-ae03a6198beb\") " pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.396252 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:27:44 crc kubenswrapper[4762]: I0217 14:27:44.523491 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.672335 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.680586 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.686226 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.686760 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.686931 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fkxwh" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.687110 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.721786 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.724529 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c","Type":"ContainerStarted","Data":"90de91a7c2b9753760077e418d70a3f94d83da2013a6507be2bad669b9446232"} Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.726492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5","Type":"ContainerStarted","Data":"a19ec027eb952203e8507429717ffbdddeedd95973f0c95cc940dc290404ecec"} Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.790898 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.790991 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791011 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791078 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791289 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88949132-c261-45b2-b4d3-856cccca2530\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88949132-c261-45b2-b4d3-856cccca2530\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7t5\" (UniqueName: \"kubernetes.io/projected/b848d44f-ad87-4491-a0af-c2028ee1827b-kube-api-access-wc7t5\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791456 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-config\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.791492 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.898610 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.898688 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.898716 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.898864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.898979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88949132-c261-45b2-b4d3-856cccca2530\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88949132-c261-45b2-b4d3-856cccca2530\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " 
pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.899097 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7t5\" (UniqueName: \"kubernetes.io/projected/b848d44f-ad87-4491-a0af-c2028ee1827b-kube-api-access-wc7t5\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.899146 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-config\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.899167 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.900867 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-config\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.901246 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.904684 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.904950 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88949132-c261-45b2-b4d3-856cccca2530\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88949132-c261-45b2-b4d3-856cccca2530\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/25615cd759fe3f51c3fc68193397d6c056c08aab4dbc4aa3507da9590dd40a9a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.910377 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d44f-ad87-4491-a0af-c2028ee1827b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.912826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.922626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7t5\" (UniqueName: \"kubernetes.io/projected/b848d44f-ad87-4491-a0af-c2028ee1827b-kube-api-access-wc7t5\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.930567 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.934517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b848d44f-ad87-4491-a0af-c2028ee1827b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:46 crc kubenswrapper[4762]: I0217 14:27:46.962592 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88949132-c261-45b2-b4d3-856cccca2530\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88949132-c261-45b2-b4d3-856cccca2530\") pod \"ovsdbserver-sb-0\" (UID: \"b848d44f-ad87-4491-a0af-c2028ee1827b\") " pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:47 crc kubenswrapper[4762]: I0217 14:27:47.011495 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.621880 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.622338 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.622389 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.623223 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.623543 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44" gracePeriod=600 Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.807280 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44" exitCode=0 Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.807325 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44"} Feb 17 14:27:54 crc kubenswrapper[4762]: I0217 14:27:54.807404 4762 scope.go:117] "RemoveContainer" containerID="ccc577972b61cd413548bab4efa2b49055d0a18dd9858698cc28b4b73b495bf9" Feb 17 14:27:59 crc kubenswrapper[4762]: E0217 14:27:59.746564 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:27:59 crc kubenswrapper[4762]: E0217 14:27:59.747374 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nx5mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-sml78_openstack(64dd25ca-1eee-49de-9efd-611c90acb3e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:27:59 crc kubenswrapper[4762]: E0217 14:27:59.749075 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-sml78" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" Feb 17 14:27:59 crc kubenswrapper[4762]: E0217 14:27:59.864313 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-sml78" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.729859 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.730455 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vv5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ggqhx_openstack(38b00521-3bad-4a3b-b706-efd326d22495): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.731705 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" podUID="38b00521-3bad-4a3b-b706-efd326d22495" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.763136 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.763380 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qp9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qq4lx_openstack(1560f7fc-7396-480e-9b67-e62ccdf2b299): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.764604 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" podUID="1560f7fc-7396-480e-9b67-e62ccdf2b299" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.843108 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.843337 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxgmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-7q75w_openstack(de8fe6a0-5c88-434f-a653-ee334a757900): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.844693 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" podUID="de8fe6a0-5c88-434f-a653-ee334a757900" Feb 17 14:28:00 crc kubenswrapper[4762]: E0217 14:28:00.959284 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" podUID="de8fe6a0-5c88-434f-a653-ee334a757900" Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.532761 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft"] Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.902687 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c","Type":"ContainerStarted","Data":"cd4c0f06e72f28dc478e8e835b83405e250e7b65458d17118bae896283e888bb"} Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.904827 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.938990 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5","Type":"ContainerStarted","Data":"243b4545053a8b88d1eb353d8420301b97eb24c9ab8c666136a01d7f4a2f7516"} Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.948963 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fe6d960-8cae-47d2-86e7-c077f0facaae","Type":"ContainerStarted","Data":"a7d1866c07724dad6fd9a89269b470fc497ea5d389364c6880bc74199b76851b"} Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.964031 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46"} Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.971093 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.405521169 podStartE2EDuration="25.971071061s" podCreationTimestamp="2026-02-17 14:27:36 +0000 UTC" firstStartedPulling="2026-02-17 14:27:46.556939167 +0000 UTC m=+1347.136939819" lastFinishedPulling="2026-02-17 14:28:01.122489059 +0000 UTC m=+1361.702489711" observedRunningTime="2026-02-17 14:28:01.963218407 +0000 UTC m=+1362.543219059" watchObservedRunningTime="2026-02-17 14:28:01.971071061 +0000 UTC m=+1362.551071723" Feb 17 14:28:01 crc kubenswrapper[4762]: I0217 14:28:01.982110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft" event={"ID":"0611dcb7-08c7-4999-8bc2-210224f89e66","Type":"ContainerStarted","Data":"16dfc695db43cfec2abdb23fa3871a5a9a192c1f8ec5aa9bfa2dcd4fdd0fbeb0"} Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.054241 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.413898 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d5bd55fbc-55znb"] Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.453058 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.471975 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-656mp"] Feb 17 14:28:02 crc kubenswrapper[4762]: W0217 14:28:02.519527 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac8722c_6cf3_4581_8107_ae03a6198beb.slice/crio-98993bf6dba5c434bbcc5e93a735037ace66d552d6cafba9101c6a3528d2d6a6 WatchSource:0}: Error finding container 98993bf6dba5c434bbcc5e93a735037ace66d552d6cafba9101c6a3528d2d6a6: Status 404 returned error can't find the container with id 98993bf6dba5c434bbcc5e93a735037ace66d552d6cafba9101c6a3528d2d6a6 Feb 17 14:28:02 crc kubenswrapper[4762]: W0217 14:28:02.523472 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d19ed64_87e9_4afd_9c02_4319baed9bda.slice/crio-1df58b4fd92738c11d81716ff930e671f339de9e1442edaa30e82ee552ff13dc WatchSource:0}: Error finding container 1df58b4fd92738c11d81716ff930e671f339de9e1442edaa30e82ee552ff13dc: Status 404 returned error can't find the container with id 1df58b4fd92738c11d81716ff930e671f339de9e1442edaa30e82ee552ff13dc Feb 17 14:28:02 crc kubenswrapper[4762]: W0217 14:28:02.527152 4762 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e153059_08c6_4155_af14_f724a156b6fd.slice/crio-569f1142eb2d456fef05801d4764616422956d5a0306830236208478663eb264 WatchSource:0}: Error finding container 569f1142eb2d456fef05801d4764616422956d5a0306830236208478663eb264: Status 404 returned error can't find the container with id 569f1142eb2d456fef05801d4764616422956d5a0306830236208478663eb264 Feb 17 14:28:02 crc kubenswrapper[4762]: W0217 14:28:02.802378 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb848d44f_ad87_4491_a0af_c2028ee1827b.slice/crio-49852693fde3606b58138f29a39b2ae93371ab9cb02df382fbd4ec77ce736b5c WatchSource:0}: Error finding container 49852693fde3606b58138f29a39b2ae93371ab9cb02df382fbd4ec77ce736b5c: Status 404 returned error can't find the container with id 49852693fde3606b58138f29a39b2ae93371ab9cb02df382fbd4ec77ce736b5c Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.806116 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:28:02 crc kubenswrapper[4762]: I0217 14:28:02.813050 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.037825 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" event={"ID":"1560f7fc-7396-480e-9b67-e62ccdf2b299","Type":"ContainerDied","Data":"959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.037878 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959d1e26213a71024d15f44b59b9a26f526c2ac15ce099659933b393784d0945" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.043161 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"12862d08-7816-4a6d-9a52-aceeae5e1d8e","Type":"ContainerStarted","Data":"b11db3113125fb889927cf674d2bbcd1aa7731c1f11642c52f42397ac3ed0e4d"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.047396 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerStarted","Data":"ea238ac7460842a43b0355902aebd50619903e918c2c80fb84a477ab2ce9c7f9"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.083744 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d5bd55fbc-55znb" event={"ID":"0ac8722c-6cf3-4581-8107-ae03a6198beb","Type":"ContainerStarted","Data":"98993bf6dba5c434bbcc5e93a735037ace66d552d6cafba9101c6a3528d2d6a6"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.090370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1","Type":"ContainerStarted","Data":"871f822e9905255baedc928635c7f6e04ebc6715f1e03baf39953b705867f569"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.093318 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d19ed64-87e9-4afd-9c02-4319baed9bda","Type":"ContainerStarted","Data":"1df58b4fd92738c11d81716ff930e671f339de9e1442edaa30e82ee552ff13dc"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.096070 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"b848d44f-ad87-4491-a0af-c2028ee1827b","Type":"ContainerStarted","Data":"49852693fde3606b58138f29a39b2ae93371ab9cb02df382fbd4ec77ce736b5c"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.098320 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" event={"ID":"38b00521-3bad-4a3b-b706-efd326d22495","Type":"ContainerDied","Data":"0ae2fe04e7b1fa76872016492eb6147f3473124d94b2643fe5832d9db01f10e5"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.098371 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae2fe04e7b1fa76872016492eb6147f3473124d94b2643fe5832d9db01f10e5" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.101035 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" event={"ID":"0e153059-08c6-4155-af14-f724a156b6fd","Type":"ContainerStarted","Data":"569f1142eb2d456fef05801d4764616422956d5a0306830236208478663eb264"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.103562 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d23bccd7-14f7-419d-95db-38470afb02b0","Type":"ContainerStarted","Data":"472881f2fea3d4c190c7a71d3688c49816c3b38f082a33ad3a8d0a2b42a985cc"} Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.147633 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.178477 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.292169 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5hl\" (UniqueName: \"kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl\") pod \"38b00521-3bad-4a3b-b706-efd326d22495\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.292298 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc\") pod \"1560f7fc-7396-480e-9b67-e62ccdf2b299\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.292382 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qp9h\" (UniqueName: \"kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h\") pod \"1560f7fc-7396-480e-9b67-e62ccdf2b299\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.292498 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config\") pod \"38b00521-3bad-4a3b-b706-efd326d22495\" (UID: \"38b00521-3bad-4a3b-b706-efd326d22495\") " Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.292517 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config\") pod \"1560f7fc-7396-480e-9b67-e62ccdf2b299\" (UID: \"1560f7fc-7396-480e-9b67-e62ccdf2b299\") " Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.293079 4762 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1560f7fc-7396-480e-9b67-e62ccdf2b299" (UID: "1560f7fc-7396-480e-9b67-e62ccdf2b299"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.293486 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.293785 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config" (OuterVolumeSpecName: "config") pod "38b00521-3bad-4a3b-b706-efd326d22495" (UID: "38b00521-3bad-4a3b-b706-efd326d22495"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.293924 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config" (OuterVolumeSpecName: "config") pod "1560f7fc-7396-480e-9b67-e62ccdf2b299" (UID: "1560f7fc-7396-480e-9b67-e62ccdf2b299"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.300074 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h" (OuterVolumeSpecName: "kube-api-access-6qp9h") pod "1560f7fc-7396-480e-9b67-e62ccdf2b299" (UID: "1560f7fc-7396-480e-9b67-e62ccdf2b299"). InnerVolumeSpecName "kube-api-access-6qp9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.314040 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl" (OuterVolumeSpecName: "kube-api-access-vv5hl") pod "38b00521-3bad-4a3b-b706-efd326d22495" (UID: "38b00521-3bad-4a3b-b706-efd326d22495"). InnerVolumeSpecName "kube-api-access-vv5hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.395303 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5hl\" (UniqueName: \"kubernetes.io/projected/38b00521-3bad-4a3b-b706-efd326d22495-kube-api-access-vv5hl\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.395568 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qp9h\" (UniqueName: \"kubernetes.io/projected/1560f7fc-7396-480e-9b67-e62ccdf2b299-kube-api-access-6qp9h\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.395579 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b00521-3bad-4a3b-b706-efd326d22495-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.395588 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1560f7fc-7396-480e-9b67-e62ccdf2b299-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:03 crc kubenswrapper[4762]: I0217 14:28:03.625638 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7gshj"] Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.122262 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"391886d8-341f-4e66-980c-00f6cd881e10","Type":"ContainerStarted","Data":"dc49693b749ed728999eb0a6e332ef87ee14582e4d7a57b7a32aec2d07dd0888"} Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.126870 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d5bd55fbc-55znb" event={"ID":"0ac8722c-6cf3-4581-8107-ae03a6198beb","Type":"ContainerStarted","Data":"92ff2292fcd5b74cdf25e9e44fa4d077308e9f2b685959c8469ac51f2cffd079"} Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.128975 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7gshj" event={"ID":"549db29e-a842-49dc-8b6b-1fe3f83857da","Type":"ContainerStarted","Data":"8e9611d0f2003d331a1c8f113cc0042c8912ba324e9be0af52cb7a21f4b1630d"} Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.129068 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qq4lx" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.129120 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggqhx" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.181527 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d5bd55fbc-55znb" podStartSLOduration=21.181506921 podStartE2EDuration="21.181506921s" podCreationTimestamp="2026-02-17 14:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:04.168771685 +0000 UTC m=+1364.748772357" watchObservedRunningTime="2026-02-17 14:28:04.181506921 +0000 UTC m=+1364.761507573" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.226853 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.254477 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggqhx"] Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.301168 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.322334 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qq4lx"] Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.396276 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.396320 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.407694 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:28:04 crc kubenswrapper[4762]: I0217 14:28:04.709459 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 14:28:05 crc kubenswrapper[4762]: I0217 14:28:05.174050 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d5bd55fbc-55znb" Feb 17 14:28:05 crc kubenswrapper[4762]: I0217 14:28:05.252556 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f76d465c-nhgvb"] Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.084558 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1560f7fc-7396-480e-9b67-e62ccdf2b299" path="/var/lib/kubelet/pods/1560f7fc-7396-480e-9b67-e62ccdf2b299/volumes" Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.085210 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b00521-3bad-4a3b-b706-efd326d22495" path="/var/lib/kubelet/pods/38b00521-3bad-4a3b-b706-efd326d22495/volumes" Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.235639 4762 generic.go:334] "Generic (PLEG): container finished" podID="bbd5850c-1106-4dd4-a7d7-b13e08eff2f5" containerID="243b4545053a8b88d1eb353d8420301b97eb24c9ab8c666136a01d7f4a2f7516" exitCode=0 Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.235919 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5","Type":"ContainerDied","Data":"243b4545053a8b88d1eb353d8420301b97eb24c9ab8c666136a01d7f4a2f7516"} Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.239981 4762 generic.go:334] "Generic (PLEG): container finished" 
podID="3fe6d960-8cae-47d2-86e7-c077f0facaae" containerID="a7d1866c07724dad6fd9a89269b470fc497ea5d389364c6880bc74199b76851b" exitCode=0 Feb 17 14:28:06 crc kubenswrapper[4762]: I0217 14:28:06.241115 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fe6d960-8cae-47d2-86e7-c077f0facaae","Type":"ContainerDied","Data":"a7d1866c07724dad6fd9a89269b470fc497ea5d389364c6880bc74199b76851b"} Feb 17 14:28:06 crc kubenswrapper[4762]: W0217 14:28:06.815764 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde4ebcd7_ede5_4a4a_aed5_55d31eee13bf.slice/crio-72537e3f689fe6d81d9a0d85b333f491a334ad627e55069bd3be27c5e1903af8 WatchSource:0}: Error finding container 72537e3f689fe6d81d9a0d85b333f491a334ad627e55069bd3be27c5e1903af8: Status 404 returned error can't find the container with id 72537e3f689fe6d81d9a0d85b333f491a334ad627e55069bd3be27c5e1903af8 Feb 17 14:28:07 crc kubenswrapper[4762]: I0217 14:28:07.263786 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf","Type":"ContainerStarted","Data":"72537e3f689fe6d81d9a0d85b333f491a334ad627e55069bd3be27c5e1903af8"} Feb 17 14:28:07 crc kubenswrapper[4762]: I0217 14:28:07.480853 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.312845 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fe6d960-8cae-47d2-86e7-c077f0facaae","Type":"ContainerStarted","Data":"407e2070b81595aa36ee3ec83e3cd654bf6871b3772261857f0dd9fd5eab5dc1"} Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.569469 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"] Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.629231 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=13.699545107 podStartE2EDuration="36.629204561s" podCreationTimestamp="2026-02-17 14:27:34 +0000 UTC" firstStartedPulling="2026-02-17 14:27:37.894205912 +0000 UTC m=+1338.474206564" lastFinishedPulling="2026-02-17 14:28:00.823865366 +0000 UTC m=+1361.403866018" observedRunningTime="2026-02-17 14:28:10.606881394 +0000 UTC m=+1371.186882046" watchObservedRunningTime="2026-02-17 14:28:10.629204561 +0000 UTC m=+1371.209205213" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.661607 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"] Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.664744 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.722599 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftp2\" (UniqueName: \"kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.722763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.722956 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:10 crc kubenswrapper[4762]: I0217 14:28:10.751264 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"] Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.010376 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.010478 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.011691 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftp2\" (UniqueName: \"kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.012631 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.013107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.234470 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftp2\" (UniqueName: 
\"kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2\") pod \"dnsmasq-dns-7cb5889db5-7sbz9\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") " pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.372653 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.432871 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b848d44f-ad87-4491-a0af-c2028ee1827b","Type":"ContainerStarted","Data":"2d7076b81feeeb34587e41e0e18cea23c01f61f076c04d8b4374bd103fd640e7"} Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.455273 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" event={"ID":"0e153059-08c6-4155-af14-f724a156b6fd","Type":"ContainerStarted","Data":"ebd05a7a080b33fdba08ed47912eb0b3fd14d4c8d80c76c8b820def2f6ba2aac"} Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.478589 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-656mp" podStartSLOduration=21.998321401 podStartE2EDuration="28.478566215s" podCreationTimestamp="2026-02-17 14:27:43 +0000 UTC" firstStartedPulling="2026-02-17 14:28:02.533706706 +0000 UTC m=+1363.113707358" lastFinishedPulling="2026-02-17 14:28:09.01395152 +0000 UTC m=+1369.593952172" observedRunningTime="2026-02-17 14:28:11.477022603 +0000 UTC m=+1372.057023255" watchObservedRunningTime="2026-02-17 14:28:11.478566215 +0000 UTC m=+1372.058566867" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.839341 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.969482 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc\") pod \"de8fe6a0-5c88-434f-a653-ee334a757900\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.969758 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config\") pod \"de8fe6a0-5c88-434f-a653-ee334a757900\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.969880 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgmh\" (UniqueName: \"kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh\") pod \"de8fe6a0-5c88-434f-a653-ee334a757900\" (UID: \"de8fe6a0-5c88-434f-a653-ee334a757900\") " Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.976617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh" (OuterVolumeSpecName: "kube-api-access-lxgmh") pod "de8fe6a0-5c88-434f-a653-ee334a757900" (UID: "de8fe6a0-5c88-434f-a653-ee334a757900"). InnerVolumeSpecName "kube-api-access-lxgmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.977124 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de8fe6a0-5c88-434f-a653-ee334a757900" (UID: "de8fe6a0-5c88-434f-a653-ee334a757900"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:11 crc kubenswrapper[4762]: I0217 14:28:11.977492 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config" (OuterVolumeSpecName: "config") pod "de8fe6a0-5c88-434f-a653-ee334a757900" (UID: "de8fe6a0-5c88-434f-a653-ee334a757900"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.071921 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgmh\" (UniqueName: \"kubernetes.io/projected/de8fe6a0-5c88-434f-a653-ee334a757900-kube-api-access-lxgmh\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.071959 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.071970 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8fe6a0-5c88-434f-a653-ee334a757900-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.309915 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"] Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.356991 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.365018 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.379903 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.380114 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.380149 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.380241 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-djg2k" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.401449 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.434816 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-674vl"] Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.436346 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.439178 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.439192 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.439420 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.446352 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-674vl"] Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.466935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d19ed64-87e9-4afd-9c02-4319baed9bda","Type":"ContainerStarted","Data":"8d3fbee898bdd4c5f8b01484c224574c540d666bff1c4ba85cf0894b8064fa05"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.466996 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.468284 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" event={"ID":"de8fe6a0-5c88-434f-a653-ee334a757900","Type":"ContainerDied","Data":"54fa9b45b56eced700a20d20f473dcfe758357fa3c8788ebd5c466d59cad9d20"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.468354 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7q75w" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.473261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bbd5850c-1106-4dd4-a7d7-b13e08eff2f5","Type":"ContainerStarted","Data":"a03b6b76380fe995378349b2d3c52e2feee0c9680a0abbd7f6912fcd70381c5b"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.476185 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf","Type":"ContainerStarted","Data":"fcad87b442bca7462ca397ff82d24ee643816c8484ea0399461573c691368c3e"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.478550 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7gshj" event={"ID":"549db29e-a842-49dc-8b6b-1fe3f83857da","Type":"ContainerStarted","Data":"c894eecd8165adec5a2fc363acc06c57668a7fe8f84deccf315b0c400111f447"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.485772 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=26.156147238 podStartE2EDuration="33.485753866s" podCreationTimestamp="2026-02-17 14:27:39 +0000 UTC" firstStartedPulling="2026-02-17 14:28:02.527373964 +0000 UTC m=+1363.107374616" lastFinishedPulling="2026-02-17 14:28:09.856980592 +0000 UTC m=+1370.436981244" observedRunningTime="2026-02-17 14:28:12.483956437 +0000 UTC m=+1373.063957079" watchObservedRunningTime="2026-02-17 14:28:12.485753866 +0000 UTC m=+1373.065754518" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.486962 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft" 
event={"ID":"0611dcb7-08c7-4999-8bc2-210224f89e66","Type":"ContainerStarted","Data":"f3dbe5b3e396203aeef1775e78b8da55df7f5194080332e6ae79cfc9e406ad92"} Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.488078 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xspft" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505088 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a7dc3-63d2-4995-ab6f-712df183303d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505346 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwws\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-kube-api-access-bxwws\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505468 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505575 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-cache\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.505829 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-lock\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: W0217 14:28:12.518371 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b70f06_d85e_428d_87c1_1e9ab9ea991b.slice/crio-1a3f56da229b59f814192ad405d19dc94396eda44ebd7d92583fac6cd07cee66 WatchSource:0}: Error finding container 1a3f56da229b59f814192ad405d19dc94396eda44ebd7d92583fac6cd07cee66: Status 404 returned error can't find the container with id 1a3f56da229b59f814192ad405d19dc94396eda44ebd7d92583fac6cd07cee66 Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.553829 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.98003314 podStartE2EDuration="37.553807555s" podCreationTimestamp="2026-02-17 14:27:35 +0000 UTC" firstStartedPulling="2026-02-17 14:27:46.549620798 +0000 UTC m=+1347.129621450" 
lastFinishedPulling="2026-02-17 14:28:01.123395213 +0000 UTC m=+1361.703395865" observedRunningTime="2026-02-17 14:28:12.544916133 +0000 UTC m=+1373.124916785" watchObservedRunningTime="2026-02-17 14:28:12.553807555 +0000 UTC m=+1373.133808237" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.575302 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xspft" podStartSLOduration=23.403086446 podStartE2EDuration="30.575282738s" podCreationTimestamp="2026-02-17 14:27:42 +0000 UTC" firstStartedPulling="2026-02-17 14:28:01.545080818 +0000 UTC m=+1362.125081470" lastFinishedPulling="2026-02-17 14:28:08.71727711 +0000 UTC m=+1369.297277762" observedRunningTime="2026-02-17 14:28:12.566934191 +0000 UTC m=+1373.146934843" watchObservedRunningTime="2026-02-17 14:28:12.575282738 +0000 UTC m=+1373.155283390" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608131 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608188 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-cache\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608376 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608408 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-lock\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608551 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc 
kubenswrapper[4762]: I0217 14:28:12.608630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjncg\" (UniqueName: \"kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608696 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608752 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a7dc3-63d2-4995-ab6f-712df183303d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwws\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-kube-api-access-bxwws\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608952 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.608999 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-cache\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.609036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: E0217 14:28:12.609559 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 14:28:12 crc kubenswrapper[4762]: E0217 14:28:12.609592 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 14:28:12 crc kubenswrapper[4762]: E0217 14:28:12.609662 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:13.109631291 +0000 UTC m=+1373.689631943 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.610399 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/466a7dc3-63d2-4995-ab6f-712df183303d-lock\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.616611 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.616676 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a96647c732d4d81c1e6a4498afb19ed55e8a02c13a20078f16f12c8890071b14/globalmount\"" pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.755517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a7dc3-63d2-4995-ab6f-712df183303d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.757686 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.755678 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwws\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-kube-api-access-bxwws\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.757902 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.758351 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.758402 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf\") pod \"swift-ring-rebalance-674vl\" (UID: 
\"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.758523 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.758579 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.758654 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjncg\" (UniqueName: \"kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.760234 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.763881 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.765802 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.922206 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.922387 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjncg\" (UniqueName: \"kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.923049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.925196 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle\") pod \"swift-ring-rebalance-674vl\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " pod="openstack/swift-ring-rebalance-674vl"
Feb 17 14:28:12 crc kubenswrapper[4762]: I0217 14:28:12.970118 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce4e4c52-9c0e-4b36-a541-2697dfcae3d8\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0"
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.170581 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0"
Feb 17 14:28:13 crc kubenswrapper[4762]: E0217 14:28:13.170778 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:28:13 crc kubenswrapper[4762]: E0217 14:28:13.170823 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:28:13 crc kubenswrapper[4762]: E0217 14:28:13.170895 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:14.170877118 +0000 UTC m=+1374.750877760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.453619 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-674vl"
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.468926 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"]
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.475489 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7q75w"]
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.501207 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" event={"ID":"b7b70f06-d85e-428d-87c1-1e9ab9ea991b","Type":"ContainerStarted","Data":"1a3f56da229b59f814192ad405d19dc94396eda44ebd7d92583fac6cd07cee66"}
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.504404 4762 generic.go:334] "Generic (PLEG): container finished" podID="549db29e-a842-49dc-8b6b-1fe3f83857da" containerID="c894eecd8165adec5a2fc363acc06c57668a7fe8f84deccf315b0c400111f447" exitCode=0
Feb 17 14:28:13 crc kubenswrapper[4762]: I0217 14:28:13.506149 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7gshj" event={"ID":"549db29e-a842-49dc-8b6b-1fe3f83857da","Type":"ContainerDied","Data":"c894eecd8165adec5a2fc363acc06c57668a7fe8f84deccf315b0c400111f447"}
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.085583 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8fe6a0-5c88-434f-a653-ee334a757900" path="/var/lib/kubelet/pods/de8fe6a0-5c88-434f-a653-ee334a757900/volumes"
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.214383 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0"
Feb 17 14:28:14 crc kubenswrapper[4762]: E0217 14:28:14.214599 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:28:14 crc kubenswrapper[4762]: E0217 14:28:14.214630 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:28:14 crc kubenswrapper[4762]: E0217 14:28:14.214715 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:16.214693395 +0000 UTC m=+1376.794694047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.541119 4762 generic.go:334] "Generic (PLEG): container finished" podID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerID="6f50fca5d365a886f57ce6e4f5bbba7aeea2375871a92ac33133593a10ea6585" exitCode=0
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.541287 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sml78" event={"ID":"64dd25ca-1eee-49de-9efd-611c90acb3e2","Type":"ContainerDied","Data":"6f50fca5d365a886f57ce6e4f5bbba7aeea2375871a92ac33133593a10ea6585"}
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.545339 4762 generic.go:334] "Generic (PLEG): container finished" podID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerID="7c85e290cfa1d8e2cd6a9ba2bf52a7b38e1e01ec5ec04fea8887436318293b33" exitCode=0
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.545633 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" event={"ID":"b7b70f06-d85e-428d-87c1-1e9ab9ea991b","Type":"ContainerDied","Data":"7c85e290cfa1d8e2cd6a9ba2bf52a7b38e1e01ec5ec04fea8887436318293b33"}
Feb 17 14:28:14 crc kubenswrapper[4762]: I0217 14:28:14.915856 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-674vl"]
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.556902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de4ebcd7-ede5-4a4a-aed5-55d31eee13bf","Type":"ContainerStarted","Data":"71aa4a38368335899520480d26a53439e8eb9f5cbb9af3205e3ae52e2f8bc905"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.561807 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerStarted","Data":"26eac05bc40a7e99203d2d5e5eda0e1ea377002924f146a145f67079e2beb4d3"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.563748 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sml78" event={"ID":"64dd25ca-1eee-49de-9efd-611c90acb3e2","Type":"ContainerStarted","Data":"f9efe15c028902c4240441e3de3d9f849e03e9a60e2f20aea458d5f1105022a3"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.563932 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-sml78"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.566477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7gshj" event={"ID":"549db29e-a842-49dc-8b6b-1fe3f83857da","Type":"ContainerStarted","Data":"453bca772c729f205e614055feba42488ea2fd8834ccf4dab039b517d388dc2b"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.566523 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7gshj" event={"ID":"549db29e-a842-49dc-8b6b-1fe3f83857da","Type":"ContainerStarted","Data":"15441c3e697e80af33d95f0d9095aca0f53b197426afe4bb92fadffb64f7a1a5"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.566541 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7gshj"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.566552 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7gshj"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.570401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-674vl" event={"ID":"f6083b27-9cd4-494a-8b51-9dff95918001","Type":"ContainerStarted","Data":"f30206ad5ce38da61bd96c1041ac042820038fa596c06cde0eed4a4875393d92"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.575019 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b848d44f-ad87-4491-a0af-c2028ee1827b","Type":"ContainerStarted","Data":"b21e7c9edb7271190989fd21c8ac7ce25b89e4443486942fe3c996fd12b881a1"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.581588 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" event={"ID":"b7b70f06-d85e-428d-87c1-1e9ab9ea991b","Type":"ContainerStarted","Data":"024be554b5bcd401984aa8441fff199e72202fb6c84a7f6704cf123d758aa475"}
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.581838 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.586783 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=26.882060222 podStartE2EDuration="34.5867705s" podCreationTimestamp="2026-02-17 14:27:41 +0000 UTC" firstStartedPulling="2026-02-17 14:28:06.818641812 +0000 UTC m=+1367.398642464" lastFinishedPulling="2026-02-17 14:28:14.52335209 +0000 UTC m=+1375.103352742" observedRunningTime="2026-02-17 14:28:15.579664937 +0000 UTC m=+1376.159665599" watchObservedRunningTime="2026-02-17 14:28:15.5867705 +0000 UTC m=+1376.166771152"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.621386 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7gshj" podStartSLOduration=28.931543096 podStartE2EDuration="33.62136592s" podCreationTimestamp="2026-02-17 14:27:42 +0000 UTC" firstStartedPulling="2026-02-17 14:28:04.027415655 +0000 UTC m=+1364.607416307" lastFinishedPulling="2026-02-17 14:28:08.717238479 +0000 UTC m=+1369.297239131" observedRunningTime="2026-02-17 14:28:15.61180663 +0000 UTC m=+1376.191807302" watchObservedRunningTime="2026-02-17 14:28:15.62136592 +0000 UTC m=+1376.201366592"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.645207 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.043767499 podStartE2EDuration="30.645180087s" podCreationTimestamp="2026-02-17 14:27:45 +0000 UTC" firstStartedPulling="2026-02-17 14:28:02.805927581 +0000 UTC m=+1363.385928233" lastFinishedPulling="2026-02-17 14:28:14.407340179 +0000 UTC m=+1374.987340821" observedRunningTime="2026-02-17 14:28:15.632541053 +0000 UTC m=+1376.212541705" watchObservedRunningTime="2026-02-17 14:28:15.645180087 +0000 UTC m=+1376.225180759"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.805558 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-sml78" podStartSLOduration=5.580376305 podStartE2EDuration="43.805532462s" podCreationTimestamp="2026-02-17 14:27:32 +0000 UTC" firstStartedPulling="2026-02-17 14:27:33.687937503 +0000 UTC m=+1334.267938155" lastFinishedPulling="2026-02-17 14:28:11.91309366 +0000 UTC m=+1372.493094312" observedRunningTime="2026-02-17 14:28:15.796694312 +0000 UTC m=+1376.376694964" watchObservedRunningTime="2026-02-17 14:28:15.805532462 +0000 UTC m=+1376.385533114"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.827094 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" podStartSLOduration=5.827070657 podStartE2EDuration="5.827070657s" podCreationTimestamp="2026-02-17 14:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:15.823659944 +0000 UTC m=+1376.403660606" watchObservedRunningTime="2026-02-17 14:28:15.827070657 +0000 UTC m=+1376.407071309"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.916151 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 17 14:28:15 crc kubenswrapper[4762]: I0217 14:28:15.916916 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 17 14:28:16 crc kubenswrapper[4762]: I0217 14:28:16.253837 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0"
Feb 17 14:28:16 crc kubenswrapper[4762]: E0217 14:28:16.254230 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:28:16 crc kubenswrapper[4762]: E0217 14:28:16.254254 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:28:16 crc kubenswrapper[4762]: E0217 14:28:16.254304 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:20.254287633 +0000 UTC m=+1380.834288285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found
Feb 17 14:28:16 crc kubenswrapper[4762]: I0217 14:28:16.867329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:28:16 crc kubenswrapper[4762]: I0217 14:28:16.930074 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.012524 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.012617 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.095954 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.584123 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.584171 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.691019 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.738130 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.750899 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.931843 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"]
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.933474 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-sml78" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="dnsmasq-dns" containerID="cri-o://f9efe15c028902c4240441e3de3d9f849e03e9a60e2f20aea458d5f1105022a3" gracePeriod=10
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.975470 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7s7b5"]
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.978993 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.983059 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 17 14:28:17 crc kubenswrapper[4762]: I0217 14:28:17.990232 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7s7b5"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.326948 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c6069ca-94f7-439c-9434-0d79b4e56500-config\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.327024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovn-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.327140 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97t2\" (UniqueName: \"kubernetes.io/projected/3c6069ca-94f7-439c-9434-0d79b4e56500-kube-api-access-m97t2\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.327198 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovs-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.327260 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-combined-ca-bundle\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.327396 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.392002 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.395382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.414084 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.424515 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.778873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97t2\" (UniqueName: \"kubernetes.io/projected/3c6069ca-94f7-439c-9434-0d79b4e56500-kube-api-access-m97t2\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.779692 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovs-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.779812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-combined-ca-bundle\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.780160 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.781498 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovs-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.787000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c6069ca-94f7-439c-9434-0d79b4e56500-config\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.788394 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.790279 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.795001 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c6069ca-94f7-439c-9434-0d79b4e56500-config\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.795074 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovn-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.795211 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3c6069ca-94f7-439c-9434-0d79b4e56500-ovn-rundir\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.799457 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.799949 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.800198 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-combined-ca-bundle\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.800353 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.800569 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-67p52"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.799578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6069ca-94f7-439c-9434-0d79b4e56500-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.806920 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97t2\" (UniqueName: \"kubernetes.io/projected/3c6069ca-94f7-439c-9434-0d79b4e56500-kube-api-access-m97t2\") pod \"ovn-controller-metrics-7s7b5\" (UID: \"3c6069ca-94f7-439c-9434-0d79b4e56500\") " pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.807486 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sml78" event={"ID":"64dd25ca-1eee-49de-9efd-611c90acb3e2","Type":"ContainerDied","Data":"f9efe15c028902c4240441e3de3d9f849e03e9a60e2f20aea458d5f1105022a3"}
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.807517 4762 generic.go:334] "Generic (PLEG): container finished" podID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerID="f9efe15c028902c4240441e3de3d9f849e03e9a60e2f20aea458d5f1105022a3" exitCode=0
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.836853 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.865337 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.866009 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="dnsmasq-dns" containerID="cri-o://024be554b5bcd401984aa8441fff199e72202fb6c84a7f6704cf123d758aa475" gracePeriod=10
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.877702 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"]
Feb 17 14:28:18 crc kubenswrapper[4762]: I0217 14:28:18.897271 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124468 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124578 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124621 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpcgg\" (UniqueName: \"kubernetes.io/projected/35249c1a-ea4f-419c-91be-dfee3dbf3303-kube-api-access-mpcgg\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124700 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124745 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124774 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-config\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znppk\" (UniqueName: \"kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-scripts\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124941 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.124984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:18.930945 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7s7b5"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.146619 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"]
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.147677 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.156238 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226680 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-config\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226878 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226919 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.226970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znppk\" (UniqueName: \"kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227025 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227098 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227115 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-scripts\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227149 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227266 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227298 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhqr\" (UniqueName: \"kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227342 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.227403 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpcgg\" (UniqueName: \"kubernetes.io/projected/35249c1a-ea4f-419c-91be-dfee3dbf3303-kube-api-access-mpcgg\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.229879 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.230632 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.230679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.230959 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.231915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-config\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.231569 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35249c1a-ea4f-419c-91be-dfee3dbf3303-scripts\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.236006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.240047 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.249526 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpcgg\" (UniqueName: \"kubernetes.io/projected/35249c1a-ea4f-419c-91be-dfee3dbf3303-kube-api-access-mpcgg\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.262084 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/35249c1a-ea4f-419c-91be-dfee3dbf3303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"35249c1a-ea4f-419c-91be-dfee3dbf3303\") " pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.273413 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znppk\" (UniqueName: \"kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk\") pod \"dnsmasq-dns-8cc7fc4dc-2jm8z\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.499329 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.499942 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.504282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.509315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.509454 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.510184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.510311 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.510414 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.513503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.514082 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.510657 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txhqr\" (UniqueName: \"kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.537128 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.546536 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhqr\" (UniqueName: \"kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr\") pod \"dnsmasq-dns-b8fbc5445-vmhb8\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.709933 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 17 14:28:19 crc kubenswrapper[4762]: I0217 14:28:19.981873 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8"
Feb 17 14:28:20 crc kubenswrapper[4762]: I0217 14:28:20.003896 4762 generic.go:334] "Generic (PLEG): container finished" podID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerID="024be554b5bcd401984aa8441fff199e72202fb6c84a7f6704cf123d758aa475" exitCode=0
Feb 17 14:28:20 crc kubenswrapper[4762]: I0217 14:28:20.003968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" event={"ID":"b7b70f06-d85e-428d-87c1-1e9ab9ea991b","Type":"ContainerDied","Data":"024be554b5bcd401984aa8441fff199e72202fb6c84a7f6704cf123d758aa475"}
Feb 17 14:28:20 crc kubenswrapper[4762]: I0217 14:28:20.403327 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 14:28:20 crc kubenswrapper[4762]: I0217 14:28:20.405894 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0"
Feb 17 14:28:20 crc kubenswrapper[4762]: E0217 14:28:20.409070 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:28:20 crc kubenswrapper[4762]: E0217 14:28:20.409101 4762 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:28:20 crc kubenswrapper[4762]: E0217 14:28:20.409141 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:28.409126665 +0000 UTC m=+1388.989127317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found
Feb 17 14:28:21 crc kubenswrapper[4762]: I0217 14:28:21.413092 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.105380 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-sml78" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.163231 4762 generic.go:334] "Generic (PLEG): container finished" podID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerID="26eac05bc40a7e99203d2d5e5eda0e1ea377002924f146a145f67079e2beb4d3" exitCode=0
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.163389 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerDied","Data":"26eac05bc40a7e99203d2d5e5eda0e1ea377002924f146a145f67079e2beb4d3"}
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.377374 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-1559-account-create-update-562bx"]
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.384879 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.392003 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.392724 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-1559-account-create-update-562bx"]
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.459627 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.503078 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc\") pod \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.503262 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config\") pod \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.504158 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ftp2\" (UniqueName: \"kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2\") pod \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\" (UID: \"b7b70f06-d85e-428d-87c1-1e9ab9ea991b\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.504696 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztl8r\" (UniqueName: \"kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.504821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.509887 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2" (OuterVolumeSpecName: "kube-api-access-4ftp2") pod "b7b70f06-d85e-428d-87c1-1e9ab9ea991b" (UID: "b7b70f06-d85e-428d-87c1-1e9ab9ea991b"). InnerVolumeSpecName "kube-api-access-4ftp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.558393 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7b70f06-d85e-428d-87c1-1e9ab9ea991b" (UID: "b7b70f06-d85e-428d-87c1-1e9ab9ea991b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.567090 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sml78"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.567437 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config" (OuterVolumeSpecName: "config") pod "b7b70f06-d85e-428d-87c1-1e9ab9ea991b" (UID: "b7b70f06-d85e-428d-87c1-1e9ab9ea991b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.606895 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc\") pod \"64dd25ca-1eee-49de-9efd-611c90acb3e2\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.606947 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config\") pod \"64dd25ca-1eee-49de-9efd-611c90acb3e2\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607171 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx5mk\" (UniqueName: \"kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk\") pod \"64dd25ca-1eee-49de-9efd-611c90acb3e2\" (UID: \"64dd25ca-1eee-49de-9efd-611c90acb3e2\") "
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607710 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztl8r\" (UniqueName: \"kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607832 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607856 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ftp2\" (UniqueName: \"kubernetes.io/projected/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-kube-api-access-4ftp2\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.607870 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b70f06-d85e-428d-87c1-1e9ab9ea991b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.610503 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk" (OuterVolumeSpecName: "kube-api-access-nx5mk") pod "64dd25ca-1eee-49de-9efd-611c90acb3e2" (UID: "64dd25ca-1eee-49de-9efd-611c90acb3e2"). InnerVolumeSpecName "kube-api-access-nx5mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.663429 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.670959 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztl8r\" (UniqueName: \"kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r\") pod \"mysqld-exporter-1559-account-create-update-562bx\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.702874 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64dd25ca-1eee-49de-9efd-611c90acb3e2" (UID: "64dd25ca-1eee-49de-9efd-611c90acb3e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.705143 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config" (OuterVolumeSpecName: "config") pod "64dd25ca-1eee-49de-9efd-611c90acb3e2" (UID: "64dd25ca-1eee-49de-9efd-611c90acb3e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.712717 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx5mk\" (UniqueName: \"kubernetes.io/projected/64dd25ca-1eee-49de-9efd-611c90acb3e2-kube-api-access-nx5mk\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.712762 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.712774 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dd25ca-1eee-49de-9efd-611c90acb3e2-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.832330 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-1559-account-create-update-562bx"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.842543 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.898243 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"]
Feb 17 14:28:23 crc kubenswrapper[4762]: I0217 14:28:23.947924 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7s7b5"]
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:23.981591 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.124674 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"]
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.130263 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:28:24 crc kubenswrapper[4762]: W0217 14:28:24.157405 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35249c1a_ea4f_419c_91be_dfee3dbf3303.slice/crio-6f48356f28ad21b1bdfd7100ce50adcd7e32d7d00b835e5fe029388a30a54040 WatchSource:0}: Error finding container 6f48356f28ad21b1bdfd7100ce50adcd7e32d7d00b835e5fe029388a30a54040: Status 404 returned error can't find the container with id 6f48356f28ad21b1bdfd7100ce50adcd7e32d7d00b835e5fe029388a30a54040
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.194296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7s7b5" event={"ID":"3c6069ca-94f7-439c-9434-0d79b4e56500","Type":"ContainerStarted","Data":"334b639d78b80e2ba344cba0d743cca6615fc478e82bb4e67c0faa2f3009e63a"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.205741 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sml78" event={"ID":"64dd25ca-1eee-49de-9efd-611c90acb3e2","Type":"ContainerDied","Data":"20f0dc9c3e1911be779bf4b8004e0dcf1f9a0a6b58b0537b101abf6cfede345e"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.205796 4762 scope.go:117] "RemoveContainer" containerID="f9efe15c028902c4240441e3de3d9f849e03e9a60e2f20aea458d5f1105022a3"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.205946 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sml78"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.212471 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerStarted","Data":"a4ace29e2d4b4ff9032bdaba7cfaf401d3b8141bca49195b8d712bb31790c124"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.243292 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bc7jm"]
Feb 17 14:28:24 crc kubenswrapper[4762]: E0217 14:28:24.243823 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="init"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.243846 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="init"
Feb 17 14:28:24 crc kubenswrapper[4762]: E0217 14:28:24.243881 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.243887 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: E0217 14:28:24.243895 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.243902 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: E0217 14:28:24.243913 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="init"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.243919 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="init"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.244169 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.244186 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" containerName="dnsmasq-dns"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.244937 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bc7jm"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.248878 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.258441 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-674vl" event={"ID":"f6083b27-9cd4-494a-8b51-9dff95918001","Type":"ContainerStarted","Data":"b2eb1cacf9d0f15de18d722a7a6403b43eac80b656a1dc2f813ca4ccea1f3ded"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.265137 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"]
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.279631 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sml78"]
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.281993 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" event={"ID":"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e","Type":"ContainerStarted","Data":"a2cbb03ad697a79ee14dced328e082da87373157cafbb1ebb8aee71e9f584e95"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.287872 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bc7jm"]
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.303215 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.303307 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7sbz9" event={"ID":"b7b70f06-d85e-428d-87c1-1e9ab9ea991b","Type":"ContainerDied","Data":"1a3f56da229b59f814192ad405d19dc94396eda44ebd7d92583fac6cd07cee66"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.306165 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-674vl" podStartSLOduration=4.464409231 podStartE2EDuration="12.306142652s" podCreationTimestamp="2026-02-17 14:28:12 +0000 UTC" firstStartedPulling="2026-02-17 14:28:14.931819987 +0000 UTC m=+1375.511820639" lastFinishedPulling="2026-02-17 14:28:22.773553408 +0000 UTC m=+1383.353554060" observedRunningTime="2026-02-17 14:28:24.296154131 +0000 UTC m=+1384.876154783" watchObservedRunningTime="2026-02-17 14:28:24.306142652 +0000 UTC m=+1384.886143304"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.328765 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"35249c1a-ea4f-419c-91be-dfee3dbf3303","Type":"ContainerStarted","Data":"6f48356f28ad21b1bdfd7100ce50adcd7e32d7d00b835e5fe029388a30a54040"}
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.911084 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts\") pod \"root-account-create-update-bc7jm\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm"
Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.911182 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbc9\" (UniqueName: \"kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9\") pod \"root-account-create-update-bc7jm\"
(UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:24 crc kubenswrapper[4762]: I0217 14:28:24.984329 4762 scope.go:117] "RemoveContainer" containerID="6f50fca5d365a886f57ce6e4f5bbba7aeea2375871a92ac33133593a10ea6585" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.000046 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"] Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.013099 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts\") pod \"root-account-create-update-bc7jm\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.013209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbc9\" (UniqueName: \"kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9\") pod \"root-account-create-update-bc7jm\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.014793 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts\") pod \"root-account-create-update-bc7jm\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.022776 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7sbz9"] Feb 17 14:28:25 crc kubenswrapper[4762]: W0217 14:28:25.032889 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60202600_f7cc_4623_abf8_d3f1ad5662aa.slice/crio-46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959 WatchSource:0}: Error finding container 46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959: Status 404 returned error can't find the container with id 46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959 Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.043107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbc9\" (UniqueName: \"kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9\") pod \"root-account-create-update-bc7jm\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.047464 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-1559-account-create-update-562bx"] Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.496146 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.764277 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" event={"ID":"60202600-f7cc-4623-abf8-d3f1ad5662aa","Type":"ContainerStarted","Data":"46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959"} Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.804560 4762 scope.go:117] "RemoveContainer" containerID="024be554b5bcd401984aa8441fff199e72202fb6c84a7f6704cf123d758aa475" Feb 17 14:28:25 crc kubenswrapper[4762]: I0217 14:28:25.844735 4762 scope.go:117] "RemoveContainer" containerID="7c85e290cfa1d8e2cd6a9ba2bf52a7b38e1e01ec5ec04fea8887436318293b33" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:26.913335 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64dd25ca-1eee-49de-9efd-611c90acb3e2" path="/var/lib/kubelet/pods/64dd25ca-1eee-49de-9efd-611c90acb3e2/volumes" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:26.914875 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b70f06-d85e-428d-87c1-1e9ab9ea991b" path="/var/lib/kubelet/pods/b7b70f06-d85e-428d-87c1-1e9ab9ea991b/volumes" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:26.915530 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bc7jm"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:26.915566 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerStarted","Data":"69e0d25e32180c6841c0d805ed308ef91a5b22c4e5ac3a36b2161727223b1837"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.722445 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5qq4s"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.724362 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.737959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkjjv\" (UniqueName: \"kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.738221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.743258 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5qq4s"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.840850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.840997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkjjv\" (UniqueName: \"kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.841416 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4bb1-account-create-update-vtj6t"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.841760 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.843232 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.847095 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.854308 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4bb1-account-create-update-vtj6t"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.856843 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7jm" event={"ID":"7d53aa13-0847-42e8-92f1-da4e51c714a7","Type":"ContainerStarted","Data":"ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.856880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7jm" event={"ID":"7d53aa13-0847-42e8-92f1-da4e51c714a7","Type":"ContainerStarted","Data":"ec8ed1ae7d2eebab75ec763243fbd8ff3b0bf356fb4291ab7e5c21fd9353b150"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.858160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7s7b5" event={"ID":"3c6069ca-94f7-439c-9434-0d79b4e56500","Type":"ContainerStarted","Data":"7736c2a9cfdf5c9129faecae9473056f8381898036675f22a7ad165700521fc4"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.861519 4762 generic.go:334] "Generic (PLEG): container finished" podID="366b755e-ebe1-4687-861b-39bb7892755a" containerID="69e0d25e32180c6841c0d805ed308ef91a5b22c4e5ac3a36b2161727223b1837" exitCode=0 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.861583 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerDied","Data":"69e0d25e32180c6841c0d805ed308ef91a5b22c4e5ac3a36b2161727223b1837"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.868150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" event={"ID":"60202600-f7cc-4623-abf8-d3f1ad5662aa","Type":"ContainerStarted","Data":"33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.872476 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkjjv\" (UniqueName: \"kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv\") pod \"glance-db-create-5qq4s\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.889185 4762 generic.go:334] "Generic (PLEG): container finished" podID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerID="2dd0bfd50a92353c58b477696b8979a4f7277e4757894da2ea8addf23cf1ba42" exitCode=0 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.889231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" event={"ID":"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e","Type":"ContainerDied","Data":"2dd0bfd50a92353c58b477696b8979a4f7277e4757894da2ea8addf23cf1ba42"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.896980 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-bc7jm" podStartSLOduration=3.896957232 podStartE2EDuration="3.896957232s" podCreationTimestamp="2026-02-17 14:28:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:27.885910742 +0000 UTC m=+1388.465911394" watchObservedRunningTime="2026-02-17 14:28:27.896957232 +0000 UTC m=+1388.476957884" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.918461 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" podStartSLOduration=4.918436185 podStartE2EDuration="4.918436185s" podCreationTimestamp="2026-02-17 14:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:27.907376725 +0000 UTC m=+1388.487377387" watchObservedRunningTime="2026-02-17 14:28:27.918436185 +0000 UTC m=+1388.498436837" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.942759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:27.942912 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgjr\" (UniqueName: \"kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.123042 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.123157 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvgjr\" (UniqueName: \"kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.123772 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.128581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.175328 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgjr\" (UniqueName: \"kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr\") pod \"glance-4bb1-account-create-update-vtj6t\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.233331 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zblds"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.236295 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.252301 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zblds"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.359478 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-400c-account-create-update-88mqh"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.360899 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.367522 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.389450 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-400c-account-create-update-88mqh"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.435937 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9f7t\" (UniqueName: \"kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.435996 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.436086 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:31 crc kubenswrapper[4762]: E0217 14:28:28.436278 4762 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 14:28:31 crc kubenswrapper[4762]: E0217 14:28:28.436292 4762 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 14:28:31 crc kubenswrapper[4762]: E0217 14:28:28.436334 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift podName:466a7dc3-63d2-4995-ab6f-712df183303d nodeName:}" failed. No retries permitted until 2026-02-17 14:28:44.436319254 +0000 UTC m=+1405.016319906 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift") pod "swift-storage-0" (UID: "466a7dc3-63d2-4995-ab6f-712df183303d") : configmap "swift-ring-files" not found Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.474174 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.538775 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.539008 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxphw\" (UniqueName: \"kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.539095 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9f7t\" (UniqueName: \"kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.539242 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.540133 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.656522 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.656750 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxphw\" 
(UniqueName: \"kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.658669 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.670779 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9f7t\" (UniqueName: \"kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t\") pod \"keystone-db-create-zblds\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.681415 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxphw\" (UniqueName: \"kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw\") pod \"keystone-400c-account-create-update-88mqh\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.698214 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.761666 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-njdl7"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.767824 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.791960 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-njdl7"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.834281 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a199-account-create-update-hxcrn"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.836770 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.838666 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.853288 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a199-account-create-update-hxcrn"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.862065 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zblds" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.867852 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtgl5\" (UniqueName: \"kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.868016 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.868053 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:28.868149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvdm\" (UniqueName: \"kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.029078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.029131 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.029221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvdm\" (UniqueName: \"kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.029307 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtgl5\" (UniqueName: \"kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.030821 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.030918 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.034870 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7s7b5" podStartSLOduration=12.034845804 podStartE2EDuration="12.034845804s" podCreationTimestamp="2026-02-17 14:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:28.926155581 +0000 UTC m=+1389.506156233" watchObservedRunningTime="2026-02-17 14:28:29.034845804 +0000 UTC m=+1389.614846456" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.056054 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvdm\" (UniqueName: \"kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm\") pod \"placement-db-create-njdl7\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.060495 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtgl5\" (UniqueName: \"kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5\") pod \"placement-a199-account-create-update-hxcrn\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.131550 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njdl7" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:29.163247 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.346655 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-77f76d465c-nhgvb" podUID="5d85da49-7985-429f-b4ed-d81ab921b28a" containerName="console" containerID="cri-o://c8fb48ad1878b5889f3ee2586929930c5c785db1918e85937bc99df92ef018b4" gracePeriod=15 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.707925 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5mzzr"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.710043 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.734287 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5mzzr"] Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.788996 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.789208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9fk\" (UniqueName: \"kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: E0217 14:28:30.821995 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d53aa13_0847_42e8_92f1_da4e51c714a7.slice/crio-conmon-ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60202600_f7cc_4623_abf8_d3f1ad5662aa.slice/crio-33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60202600_f7cc_4623_abf8_d3f1ad5662aa.slice/crio-conmon-33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d53aa13_0847_42e8_92f1_da4e51c714a7.slice/crio-ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.891256 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9fk\" (UniqueName: \"kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.891340 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.892374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " 
pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.908810 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9fk\" (UniqueName: \"kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk\") pod \"mysqld-exporter-openstack-db-create-5mzzr\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.932333 4762 generic.go:334] "Generic (PLEG): container finished" podID="60202600-f7cc-4623-abf8-d3f1ad5662aa" containerID="33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e" exitCode=0 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.932386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" event={"ID":"60202600-f7cc-4623-abf8-d3f1ad5662aa","Type":"ContainerDied","Data":"33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.934512 4762 generic.go:334] "Generic (PLEG): container finished" podID="7d53aa13-0847-42e8-92f1-da4e51c714a7" containerID="ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5" exitCode=0 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.934551 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7jm" event={"ID":"7d53aa13-0847-42e8-92f1-da4e51c714a7","Type":"ContainerDied","Data":"ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.936476 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f76d465c-nhgvb_5d85da49-7985-429f-b4ed-d81ab921b28a/console/0.log" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.936499 4762 generic.go:334] "Generic (PLEG): container finished" podID="5d85da49-7985-429f-b4ed-d81ab921b28a" containerID="c8fb48ad1878b5889f3ee2586929930c5c785db1918e85937bc99df92ef018b4" exitCode=2 Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:30.936516 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f76d465c-nhgvb" event={"ID":"5d85da49-7985-429f-b4ed-d81ab921b28a","Type":"ContainerDied","Data":"c8fb48ad1878b5889f3ee2586929930c5c785db1918e85937bc99df92ef018b4"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:31.135504 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:31.974663 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" event={"ID":"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e","Type":"ContainerStarted","Data":"246ffe15dbba94feb95110ee0a41781f663ada7a4abb43652d1fffebba205cb9"} Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:31.975187 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" Feb 17 14:28:31 crc kubenswrapper[4762]: I0217 14:28:31.998502 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" podStartSLOduration=14.998458234 podStartE2EDuration="14.998458234s" podCreationTimestamp="2026-02-17 14:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:31.994158298 +0000 UTC m=+1392.574158960" watchObservedRunningTime="2026-02-17 14:28:31.998458234 +0000 UTC m=+1392.578458886" Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.006496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerStarted","Data":"7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888"} Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.038904 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" podStartSLOduration=14.038864522 podStartE2EDuration="14.038864522s" podCreationTimestamp="2026-02-17 14:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:32.033443155 +0000 UTC m=+1392.613443807" watchObservedRunningTime="2026-02-17 14:28:32.038864522 +0000 UTC m=+1392.618865174" Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.543832 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5qq4s"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.557007 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a199-account-create-update-hxcrn"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.566498 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4bb1-account-create-update-vtj6t"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.732147 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-400c-account-create-update-88mqh"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.740903 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-njdl7"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.833304 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5mzzr"] Feb 17 14:28:32 crc kubenswrapper[4762]: I0217 14:28:32.871381 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zblds"] Feb 17 14:28:33 crc kubenswrapper[4762]: W0217 14:28:33.006948 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f5362f_c5e9_4e05_8a7d_6071fa53c4ab.slice/crio-8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8 WatchSource:0}: Error 
finding container 8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8: Status 404 returned error can't find the container with id 8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8 Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.027041 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f76d465c-nhgvb_5d85da49-7985-429f-b4ed-d81ab921b28a/console/0.log" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.027183 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77f76d465c-nhgvb" event={"ID":"5d85da49-7985-429f-b4ed-d81ab921b28a","Type":"ContainerDied","Data":"6f079a9d76ae9386818de75c547d45d1d76615870bd301de638e01b7863c2120"} Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.027243 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f079a9d76ae9386818de75c547d45d1d76615870bd301de638e01b7863c2120" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.027384 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" Feb 17 14:28:33 crc kubenswrapper[4762]: W0217 14:28:33.047904 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11daea56_42b9_45b6_980a_c6afbe877c80.slice/crio-eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95 WatchSource:0}: Error finding container eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95: Status 404 returned error can't find the container with id eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95 Feb 17 14:28:33 crc kubenswrapper[4762]: W0217 14:28:33.054512 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod808ae239_be89_433d_ab1f_8807e658af8d.slice/crio-6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9 WatchSource:0}: Error finding container 6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9: Status 404 returned error can't find the container with id 6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9 Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.399756 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77f76d465c-nhgvb_5d85da49-7985-429f-b4ed-d81ab921b28a/console/0.log" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.400414 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.493717 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-console-config\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.493918 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-trusted-ca-bundle\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.493969 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-oauth-config\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.494015 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgjf4\" (UniqueName: \"kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.494056 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.494081 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.494160 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-service-ca\") pod \"5d85da49-7985-429f-b4ed-d81ab921b28a\" (UID: \"5d85da49-7985-429f-b4ed-d81ab921b28a\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.496031 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.496042 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-service-ca" (OuterVolumeSpecName: "service-ca") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.496476 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.496574 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-console-config" (OuterVolumeSpecName: "console-config") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.501779 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.501801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4" (OuterVolumeSpecName: "kube-api-access-wgjf4") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "kube-api-access-wgjf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.504008 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5d85da49-7985-429f-b4ed-d81ab921b28a" (UID: "5d85da49-7985-429f-b4ed-d81ab921b28a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596728 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596755 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596766 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgjf4\" (UniqueName: \"kubernetes.io/projected/5d85da49-7985-429f-b4ed-d81ab921b28a-kube-api-access-wgjf4\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596775 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596783 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d85da49-7985-429f-b4ed-d81ab921b28a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596791 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.596799 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5d85da49-7985-429f-b4ed-d81ab921b28a-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.696379 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.753467 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.801762 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts\") pod \"7d53aa13-0847-42e8-92f1-da4e51c714a7\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.801893 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts\") pod \"60202600-f7cc-4623-abf8-d3f1ad5662aa\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.802123 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxbc9\" (UniqueName: \"kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9\") pod \"7d53aa13-0847-42e8-92f1-da4e51c714a7\" (UID: \"7d53aa13-0847-42e8-92f1-da4e51c714a7\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.802156 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztl8r\" (UniqueName: \"kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r\") pod \"60202600-f7cc-4623-abf8-d3f1ad5662aa\" (UID: \"60202600-f7cc-4623-abf8-d3f1ad5662aa\") " Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.802522 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d53aa13-0847-42e8-92f1-da4e51c714a7" (UID: "7d53aa13-0847-42e8-92f1-da4e51c714a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.803031 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60202600-f7cc-4623-abf8-d3f1ad5662aa" (UID: "60202600-f7cc-4623-abf8-d3f1ad5662aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.827157 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9" (OuterVolumeSpecName: "kube-api-access-mxbc9") pod "7d53aa13-0847-42e8-92f1-da4e51c714a7" (UID: "7d53aa13-0847-42e8-92f1-da4e51c714a7"). InnerVolumeSpecName "kube-api-access-mxbc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.827220 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r" (OuterVolumeSpecName: "kube-api-access-ztl8r") pod "60202600-f7cc-4623-abf8-d3f1ad5662aa" (UID: "60202600-f7cc-4623-abf8-d3f1ad5662aa"). InnerVolumeSpecName "kube-api-access-ztl8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.904356 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxbc9\" (UniqueName: \"kubernetes.io/projected/7d53aa13-0847-42e8-92f1-da4e51c714a7-kube-api-access-mxbc9\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.905839 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztl8r\" (UniqueName: \"kubernetes.io/projected/60202600-f7cc-4623-abf8-d3f1ad5662aa-kube-api-access-ztl8r\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.905856 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d53aa13-0847-42e8-92f1-da4e51c714a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:33 crc kubenswrapper[4762]: I0217 14:28:33.905865 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60202600-f7cc-4623-abf8-d3f1ad5662aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.039551 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a199-account-create-update-hxcrn" event={"ID":"46085b5b-97db-43a2-9a40-b6fc4c6d4f60","Type":"ContainerStarted","Data":"785cbb491cbe5df25dbc9964a71629fcc710851a6d6098ddbc88a1fd90c4a699"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.039607 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a199-account-create-update-hxcrn" event={"ID":"46085b5b-97db-43a2-9a40-b6fc4c6d4f60","Type":"ContainerStarted","Data":"afc3204d91e5de54846e5a291732ee90796fde5aa10da2347d89c07f06f632d1"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.042567 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4bb1-account-create-update-vtj6t" event={"ID":"9c65095d-efc4-4480-b244-55169974d63d","Type":"ContainerStarted","Data":"e78f423ef5b9833e47c7d8dc53eaeeb83fee497be745e0ddaccd591008b6d099"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.042665 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4bb1-account-create-update-vtj6t" event={"ID":"9c65095d-efc4-4480-b244-55169974d63d","Type":"ContainerStarted","Data":"288f9f76b21c7a67624652d66eddbdc3c6c1322f507e3c32812aea964a8d75d3"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.045820 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-400c-account-create-update-88mqh" event={"ID":"8c69c000-54f6-4b64-a7fa-454fd519aad5","Type":"ContainerStarted","Data":"228fb8a43a6cd143d797a569a730b494dc088b00a3f6bd259e1c0e21a9f7450b"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.045867 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-400c-account-create-update-88mqh" event={"ID":"8c69c000-54f6-4b64-a7fa-454fd519aad5","Type":"ContainerStarted","Data":"289e21d46dfa36e4ba8daee4fbfe0de4de42b5772e499f22fd86275d3060d62e"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.054130 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" event={"ID":"60202600-f7cc-4623-abf8-d3f1ad5662aa","Type":"ContainerDied","Data":"46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.054194 4762 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46a08305c674a126db468981d1924b51127c53db32a6a327b41f4091577cd959" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.054189 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-1559-account-create-update-562bx" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.059119 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7jm" event={"ID":"7d53aa13-0847-42e8-92f1-da4e51c714a7","Type":"ContainerDied","Data":"ec8ed1ae7d2eebab75ec763243fbd8ff3b0bf356fb4291ab7e5c21fd9353b150"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.059169 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec8ed1ae7d2eebab75ec763243fbd8ff3b0bf356fb4291ab7e5c21fd9353b150" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.059259 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bc7jm" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.082191 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-a199-account-create-update-hxcrn" podStartSLOduration=6.082173461 podStartE2EDuration="6.082173461s" podCreationTimestamp="2026-02-17 14:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.069046334 +0000 UTC m=+1394.649046986" watchObservedRunningTime="2026-02-17 14:28:34.082173461 +0000 UTC m=+1394.662174113" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.084164 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-4bb1-account-create-update-vtj6t" podStartSLOduration=7.084154835 podStartE2EDuration="7.084154835s" podCreationTimestamp="2026-02-17 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.082790698 +0000 UTC m=+1394.662791340" watchObservedRunningTime="2026-02-17 14:28:34.084154835 +0000 UTC m=+1394.664155487" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.088893 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77f76d465c-nhgvb" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.113862 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-400c-account-create-update-88mqh" podStartSLOduration=6.113847801 podStartE2EDuration="6.113847801s" podCreationTimestamp="2026-02-17 14:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.106470591 +0000 UTC m=+1394.686471253" watchObservedRunningTime="2026-02-17 14:28:34.113847801 +0000 UTC m=+1394.693848453" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114862 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" event={"ID":"11daea56-42b9-45b6-980a-c6afbe877c80","Type":"ContainerStarted","Data":"799f0be8de6774ac888492558e975cbeba5b8650dabba95c8964353f2b8866b6"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114917 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" event={"ID":"11daea56-42b9-45b6-980a-c6afbe877c80","Type":"ContainerStarted","Data":"eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njdl7" event={"ID":"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25","Type":"ContainerStarted","Data":"aa92c3b100e57f65921e0e3059e1b58d730bba3b1aa114fbd82fb24afede67a2"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114956 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njdl7" event={"ID":"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25","Type":"ContainerStarted","Data":"e168955d73742201234d7d10d6b372d54a8bc1c545af7735582b7ccab4ba226c"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zblds" event={"ID":"808ae239-be89-433d-ab1f-8807e658af8d","Type":"ContainerStarted","Data":"d71554e5eab2f9324767fa0ce932a2d26c3a6a4bd329fc5dd75e3dde4406cefa"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114980 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zblds" event={"ID":"808ae239-be89-433d-ab1f-8807e658af8d","Type":"ContainerStarted","Data":"6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.114992 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5qq4s" event={"ID":"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab","Type":"ContainerStarted","Data":"6b585fc1d7e508864bf3c545229786358225e1d6cca453ad147dcb0c79b40189"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.115005 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5qq4s" event={"ID":"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab","Type":"ContainerStarted","Data":"8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8"} Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.138676 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" podStartSLOduration=4.138639585 podStartE2EDuration="4.138639585s" podCreationTimestamp="2026-02-17 14:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.131025168 +0000 UTC m=+1394.711025830" watchObservedRunningTime="2026-02-17 14:28:34.138639585 +0000 UTC m=+1394.718640237" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.173334 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5qq4s" podStartSLOduration=7.173316927 podStartE2EDuration="7.173316927s" podCreationTimestamp="2026-02-17 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.147488385 +0000 UTC m=+1394.727489037" watchObservedRunningTime="2026-02-17 14:28:34.173316927 +0000 UTC m=+1394.753317579" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.192662 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-njdl7" podStartSLOduration=6.192629332 podStartE2EDuration="6.192629332s" podCreationTimestamp="2026-02-17 14:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.167315214 +0000 UTC m=+1394.747315886" watchObservedRunningTime="2026-02-17 14:28:34.192629332 +0000 UTC m=+1394.772629984" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.209786 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-zblds" podStartSLOduration=6.209764437 podStartE2EDuration="6.209764437s" podCreationTimestamp="2026-02-17 14:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:34.180952834 +0000 UTC m=+1394.760953496" watchObservedRunningTime="2026-02-17 14:28:34.209764437 +0000 UTC m=+1394.789765089" Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.229191 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77f76d465c-nhgvb"] Feb 17 14:28:34 crc kubenswrapper[4762]: I0217 14:28:34.245159 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77f76d465c-nhgvb"] Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.100410 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" containerID="6b585fc1d7e508864bf3c545229786358225e1d6cca453ad147dcb0c79b40189" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.100514 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5qq4s" event={"ID":"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab","Type":"ContainerDied","Data":"6b585fc1d7e508864bf3c545229786358225e1d6cca453ad147dcb0c79b40189"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.106132 4762 generic.go:334] "Generic (PLEG): container finished" podID="46085b5b-97db-43a2-9a40-b6fc4c6d4f60" containerID="785cbb491cbe5df25dbc9964a71629fcc710851a6d6098ddbc88a1fd90c4a699" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.108158 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a199-account-create-update-hxcrn" event={"ID":"46085b5b-97db-43a2-9a40-b6fc4c6d4f60","Type":"ContainerDied","Data":"785cbb491cbe5df25dbc9964a71629fcc710851a6d6098ddbc88a1fd90c4a699"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.112329 4762 generic.go:334] "Generic (PLEG): container finished" 
podID="8c69c000-54f6-4b64-a7fa-454fd519aad5" containerID="228fb8a43a6cd143d797a569a730b494dc088b00a3f6bd259e1c0e21a9f7450b" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.112434 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-400c-account-create-update-88mqh" event={"ID":"8c69c000-54f6-4b64-a7fa-454fd519aad5","Type":"ContainerDied","Data":"228fb8a43a6cd143d797a569a730b494dc088b00a3f6bd259e1c0e21a9f7450b"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.124134 4762 generic.go:334] "Generic (PLEG): container finished" podID="d23bccd7-14f7-419d-95db-38470afb02b0" containerID="472881f2fea3d4c190c7a71d3688c49816c3b38f082a33ad3a8d0a2b42a985cc" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.124210 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d23bccd7-14f7-419d-95db-38470afb02b0","Type":"ContainerDied","Data":"472881f2fea3d4c190c7a71d3688c49816c3b38f082a33ad3a8d0a2b42a985cc"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.127259 4762 generic.go:334] "Generic (PLEG): container finished" podID="12862d08-7816-4a6d-9a52-aceeae5e1d8e" containerID="b11db3113125fb889927cf674d2bbcd1aa7731c1f11642c52f42397ac3ed0e4d" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.127325 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"12862d08-7816-4a6d-9a52-aceeae5e1d8e","Type":"ContainerDied","Data":"b11db3113125fb889927cf674d2bbcd1aa7731c1f11642c52f42397ac3ed0e4d"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.129751 4762 generic.go:334] "Generic (PLEG): container finished" podID="3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" containerID="aa92c3b100e57f65921e0e3059e1b58d730bba3b1aa114fbd82fb24afede67a2" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.129963 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njdl7" event={"ID":"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25","Type":"ContainerDied","Data":"aa92c3b100e57f65921e0e3059e1b58d730bba3b1aa114fbd82fb24afede67a2"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.138566 4762 generic.go:334] "Generic (PLEG): container finished" podID="6c34ffbd-b33d-4579-8a4d-a51ef852b1a1" containerID="871f822e9905255baedc928635c7f6e04ebc6715f1e03baf39953b705867f569" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.138627 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1","Type":"ContainerDied","Data":"871f822e9905255baedc928635c7f6e04ebc6715f1e03baf39953b705867f569"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.142954 4762 generic.go:334] "Generic (PLEG): container finished" podID="9c65095d-efc4-4480-b244-55169974d63d" containerID="e78f423ef5b9833e47c7d8dc53eaeeb83fee497be745e0ddaccd591008b6d099" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.143068 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4bb1-account-create-update-vtj6t" event={"ID":"9c65095d-efc4-4480-b244-55169974d63d","Type":"ContainerDied","Data":"e78f423ef5b9833e47c7d8dc53eaeeb83fee497be745e0ddaccd591008b6d099"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.157173 4762 generic.go:334] "Generic (PLEG): container finished" podID="11daea56-42b9-45b6-980a-c6afbe877c80" containerID="799f0be8de6774ac888492558e975cbeba5b8650dabba95c8964353f2b8866b6" exitCode=0 Feb 17 14:28:35 
crc kubenswrapper[4762]: I0217 14:28:35.157288 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" event={"ID":"11daea56-42b9-45b6-980a-c6afbe877c80","Type":"ContainerDied","Data":"799f0be8de6774ac888492558e975cbeba5b8650dabba95c8964353f2b8866b6"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.160390 4762 generic.go:334] "Generic (PLEG): container finished" podID="808ae239-be89-433d-ab1f-8807e658af8d" containerID="d71554e5eab2f9324767fa0ce932a2d26c3a6a4bd329fc5dd75e3dde4406cefa" exitCode=0 Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.160443 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zblds" event={"ID":"808ae239-be89-433d-ab1f-8807e658af8d","Type":"ContainerDied","Data":"d71554e5eab2f9324767fa0ce932a2d26c3a6a4bd329fc5dd75e3dde4406cefa"} Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.829144 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bc7jm"] Feb 17 14:28:35 crc kubenswrapper[4762]: I0217 14:28:35.837210 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bc7jm"] Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.169861 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d85da49-7985-429f-b4ed-d81ab921b28a" path="/var/lib/kubelet/pods/5d85da49-7985-429f-b4ed-d81ab921b28a/volumes" Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.170595 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d53aa13-0847-42e8-92f1-da4e51c714a7" path="/var/lib/kubelet/pods/7d53aa13-0847-42e8-92f1-da4e51c714a7/volumes" Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.178503 4762 generic.go:334] "Generic (PLEG): container finished" podID="f6083b27-9cd4-494a-8b51-9dff95918001" containerID="b2eb1cacf9d0f15de18d722a7a6403b43eac80b656a1dc2f813ca4ccea1f3ded" exitCode=0 Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.178610 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-674vl" event={"ID":"f6083b27-9cd4-494a-8b51-9dff95918001","Type":"ContainerDied","Data":"b2eb1cacf9d0f15de18d722a7a6403b43eac80b656a1dc2f813ca4ccea1f3ded"} Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.182433 4762 generic.go:334] "Generic (PLEG): container finished" podID="391886d8-341f-4e66-980c-00f6cd881e10" containerID="dc49693b749ed728999eb0a6e332ef87ee14582e4d7a57b7a32aec2d07dd0888" exitCode=0 Feb 17 14:28:36 crc kubenswrapper[4762]: I0217 14:28:36.182618 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"391886d8-341f-4e66-980c-00f6cd881e10","Type":"ContainerDied","Data":"dc49693b749ed728999eb0a6e332ef87ee14582e4d7a57b7a32aec2d07dd0888"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.123713 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.144736 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.202387 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.226419 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.230945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4bb1-account-create-update-vtj6t" event={"ID":"9c65095d-efc4-4480-b244-55169974d63d","Type":"ContainerDied","Data":"288f9f76b21c7a67624652d66eddbdc3c6c1322f507e3c32812aea964a8d75d3"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.230987 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288f9f76b21c7a67624652d66eddbdc3c6c1322f507e3c32812aea964a8d75d3" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.231056 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4bb1-account-create-update-vtj6t" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.239052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-674vl" event={"ID":"f6083b27-9cd4-494a-8b51-9dff95918001","Type":"ContainerDied","Data":"f30206ad5ce38da61bd96c1041ac042820038fa596c06cde0eed4a4875393d92"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.239088 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f30206ad5ce38da61bd96c1041ac042820038fa596c06cde0eed4a4875393d92" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.242639 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-400c-account-create-update-88mqh" event={"ID":"8c69c000-54f6-4b64-a7fa-454fd519aad5","Type":"ContainerDied","Data":"289e21d46dfa36e4ba8daee4fbfe0de4de42b5772e499f22fd86275d3060d62e"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.242688 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289e21d46dfa36e4ba8daee4fbfe0de4de42b5772e499f22fd86275d3060d62e" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.246334 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" event={"ID":"11daea56-42b9-45b6-980a-c6afbe877c80","Type":"ContainerDied","Data":"eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.246357 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf352dff9a078897cd602cbeedbd860c790bc0118c2f71ff2e4bee14a704b95" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.246367 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-5mzzr" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.254188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zblds" event={"ID":"808ae239-be89-433d-ab1f-8807e658af8d","Type":"ContainerDied","Data":"6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.254234 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a08a308dd842acf70a9d7898aa512499f274437ccf03b57ec4bf48cc6e722c9" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.257354 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.258606 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njdl7" event={"ID":"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25","Type":"ContainerDied","Data":"e168955d73742201234d7d10d6b372d54a8bc1c545af7735582b7ccab4ba226c"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.258655 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e168955d73742201234d7d10d6b372d54a8bc1c545af7735582b7ccab4ba226c" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.264794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5qq4s" event={"ID":"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab","Type":"ContainerDied","Data":"8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.265060 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8502ceca65a1d2ebe9379fadc4a25163154c75704585eb78a6ade4d5dd407de8" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.264834 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5qq4s" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.272860 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zblds" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.273942 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a199-account-create-update-hxcrn" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.273942 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a199-account-create-update-hxcrn" event={"ID":"46085b5b-97db-43a2-9a40-b6fc4c6d4f60","Type":"ContainerDied","Data":"afc3204d91e5de54846e5a291732ee90796fde5aa10da2347d89c07f06f632d1"} Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.274437 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc3204d91e5de54846e5a291732ee90796fde5aa10da2347d89c07f06f632d1" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.277507 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-njdl7" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.287873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts\") pod \"9c65095d-efc4-4480-b244-55169974d63d\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.287927 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts\") pod \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.288245 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvgjr\" (UniqueName: \"kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr\") pod \"9c65095d-efc4-4480-b244-55169974d63d\" (UID: \"9c65095d-efc4-4480-b244-55169974d63d\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.288303 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtgl5\" (UniqueName: \"kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5\") pod \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\" (UID: \"46085b5b-97db-43a2-9a40-b6fc4c6d4f60\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.288414 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c65095d-efc4-4480-b244-55169974d63d" (UID: "9c65095d-efc4-4480-b244-55169974d63d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.289700 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46085b5b-97db-43a2-9a40-b6fc4c6d4f60" (UID: "46085b5b-97db-43a2-9a40-b6fc4c6d4f60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.290577 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.290604 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c65095d-efc4-4480-b244-55169974d63d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.297252 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5" (OuterVolumeSpecName: "kube-api-access-dtgl5") pod "46085b5b-97db-43a2-9a40-b6fc4c6d4f60" (UID: "46085b5b-97db-43a2-9a40-b6fc4c6d4f60"). InnerVolumeSpecName "kube-api-access-dtgl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.298603 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr" (OuterVolumeSpecName: "kube-api-access-tvgjr") pod "9c65095d-efc4-4480-b244-55169974d63d" (UID: "9c65095d-efc4-4480-b244-55169974d63d"). InnerVolumeSpecName "kube-api-access-tvgjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.305109 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.394882 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvdm\" (UniqueName: \"kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm\") pod \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395347 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts\") pod \"8c69c000-54f6-4b64-a7fa-454fd519aad5\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395462 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts\") pod \"11daea56-42b9-45b6-980a-c6afbe877c80\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxphw\" (UniqueName: \"kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw\") pod \"8c69c000-54f6-4b64-a7fa-454fd519aad5\" (UID: \"8c69c000-54f6-4b64-a7fa-454fd519aad5\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395716 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395822 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkjjv\" (UniqueName: \"kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv\") pod \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.395946 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn9fk\" (UniqueName: \"kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk\") pod \"11daea56-42b9-45b6-980a-c6afbe877c80\" (UID: \"11daea56-42b9-45b6-980a-c6afbe877c80\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396044 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: 
\"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396130 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts\") pod \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\" (UID: \"d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396231 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396320 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts\") pod \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\" (UID: \"3cb9fb92-bfd5-48fc-8d6f-1b616a958e25\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjncg\" (UniqueName: \"kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396463 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts\") pod \"808ae239-be89-433d-ab1f-8807e658af8d\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396531 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396627 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396741 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9f7t\" (UniqueName: \"kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t\") pod \"808ae239-be89-433d-ab1f-8807e658af8d\" (UID: \"808ae239-be89-433d-ab1f-8807e658af8d\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.396823 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf\") pod \"f6083b27-9cd4-494a-8b51-9dff95918001\" (UID: \"f6083b27-9cd4-494a-8b51-9dff95918001\") " Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.398013 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvgjr\" (UniqueName: \"kubernetes.io/projected/9c65095d-efc4-4480-b244-55169974d63d-kube-api-access-tvgjr\") on node \"crc\" DevicePath \"\"" Feb 
17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.398091 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtgl5\" (UniqueName: \"kubernetes.io/projected/46085b5b-97db-43a2-9a40-b6fc4c6d4f60-kube-api-access-dtgl5\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.404463 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk" (OuterVolumeSpecName: "kube-api-access-hn9fk") pod "11daea56-42b9-45b6-980a-c6afbe877c80" (UID: "11daea56-42b9-45b6-980a-c6afbe877c80"). InnerVolumeSpecName "kube-api-access-hn9fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.407864 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" (UID: "d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.407963 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.408359 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "808ae239-be89-433d-ab1f-8807e658af8d" (UID: "808ae239-be89-433d-ab1f-8807e658af8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.408448 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c69c000-54f6-4b64-a7fa-454fd519aad5" (UID: "8c69c000-54f6-4b64-a7fa-454fd519aad5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.409267 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.409498 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11daea56-42b9-45b6-980a-c6afbe877c80" (UID: "11daea56-42b9-45b6-980a-c6afbe877c80"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.409568 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" (UID: "3cb9fb92-bfd5-48fc-8d6f-1b616a958e25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.409904 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm" (OuterVolumeSpecName: "kube-api-access-tvvdm") pod "3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" (UID: "3cb9fb92-bfd5-48fc-8d6f-1b616a958e25"). InnerVolumeSpecName "kube-api-access-tvvdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.414908 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv" (OuterVolumeSpecName: "kube-api-access-tkjjv") pod "d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" (UID: "d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab"). InnerVolumeSpecName "kube-api-access-tkjjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.415417 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw" (OuterVolumeSpecName: "kube-api-access-dxphw") pod "8c69c000-54f6-4b64-a7fa-454fd519aad5" (UID: "8c69c000-54f6-4b64-a7fa-454fd519aad5"). InnerVolumeSpecName "kube-api-access-dxphw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.425976 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t" (OuterVolumeSpecName: "kube-api-access-d9f7t") pod "808ae239-be89-433d-ab1f-8807e658af8d" (UID: "808ae239-be89-433d-ab1f-8807e658af8d"). InnerVolumeSpecName "kube-api-access-d9f7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.436833 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg" (OuterVolumeSpecName: "kube-api-access-zjncg") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "kube-api-access-zjncg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.464869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.465434 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts" (OuterVolumeSpecName: "scripts") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.474899 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499372 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499397 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjncg\" (UniqueName: \"kubernetes.io/projected/f6083b27-9cd4-494a-8b51-9dff95918001-kube-api-access-zjncg\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499407 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808ae239-be89-433d-ab1f-8807e658af8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499417 4762 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499427 4762 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6083b27-9cd4-494a-8b51-9dff95918001-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.499436 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9f7t\" (UniqueName: \"kubernetes.io/projected/808ae239-be89-433d-ab1f-8807e658af8d-kube-api-access-d9f7t\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500123 4762 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500152 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvdm\" (UniqueName: \"kubernetes.io/projected/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25-kube-api-access-tvvdm\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500172 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c69c000-54f6-4b64-a7fa-454fd519aad5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500182 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11daea56-42b9-45b6-980a-c6afbe877c80-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500191 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxphw\" (UniqueName: \"kubernetes.io/projected/8c69c000-54f6-4b64-a7fa-454fd519aad5-kube-api-access-dxphw\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500201 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6083b27-9cd4-494a-8b51-9dff95918001-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500210 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkjjv\" (UniqueName: \"kubernetes.io/projected/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-kube-api-access-tkjjv\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500221 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn9fk\" (UniqueName: \"kubernetes.io/projected/11daea56-42b9-45b6-980a-c6afbe877c80-kube-api-access-hn9fk\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500231 4762 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.500243 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.528509 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6083b27-9cd4-494a-8b51-9dff95918001" (UID: "f6083b27-9cd4-494a-8b51-9dff95918001"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:38 crc kubenswrapper[4762]: I0217 14:28:38.601777 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6083b27-9cd4-494a-8b51-9dff95918001-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.289941 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d23bccd7-14f7-419d-95db-38470afb02b0","Type":"ContainerStarted","Data":"f29c36abbe0a16f4a85436383638f4732114374f4c24932eaaec1301d57b34cf"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.291587 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.295067 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"35249c1a-ea4f-419c-91be-dfee3dbf3303","Type":"ContainerStarted","Data":"9a6fb09534d10d2a7b41db7daa90d7204eff702ae0d67e56d69e0bdcb34be862"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.295116 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"35249c1a-ea4f-419c-91be-dfee3dbf3303","Type":"ContainerStarted","Data":"fb2a01442f68a3d3d2419b45446c9e3c56e75ed0a730156d59dc67836e395c64"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.296525 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.309553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"12862d08-7816-4a6d-9a52-aceeae5e1d8e","Type":"ContainerStarted","Data":"4bbb648d0b26be75859a12703683b88c54e0ffe74033d6fd1e7e15aa8884a872"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.312935 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.319046 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"391886d8-341f-4e66-980c-00f6cd881e10","Type":"ContainerStarted","Data":"95eb23122feb3fb0347f16fcb75637cfb505e8c96b7bcc2ac70b34bc6a0290fe"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.319538 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.325553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerStarted","Data":"3cd041b3d46bc24d231294c9e613858fe5c95b7ae71f17e4af6727b51ee49c66"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.328953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c34ffbd-b33d-4579-8a4d-a51ef852b1a1","Type":"ContainerStarted","Data":"51242b649096be346f5ec5bdb2368ab598938137462fef113db0a6d6819bda69"} Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.329030 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-400c-account-create-update-88mqh" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.329317 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zblds" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.336980 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njdl7" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.337759 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.337831 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-674vl" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.339783 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z5pp2"] Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340402 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d85da49-7985-429f-b4ed-d81ab921b28a" containerName="console" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.340480 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d85da49-7985-429f-b4ed-d81ab921b28a" containerName="console" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340535 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.340582 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340635 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46085b5b-97db-43a2-9a40-b6fc4c6d4f60" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.340702 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="46085b5b-97db-43a2-9a40-b6fc4c6d4f60" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340758 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808ae239-be89-433d-ab1f-8807e658af8d" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.340815 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="808ae239-be89-433d-ab1f-8807e658af8d" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340868 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11daea56-42b9-45b6-980a-c6afbe877c80" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.340914 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="11daea56-42b9-45b6-980a-c6afbe877c80" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.340963 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c65095d-efc4-4480-b244-55169974d63d" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341007 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c65095d-efc4-4480-b244-55169974d63d" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.341058 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60202600-f7cc-4623-abf8-d3f1ad5662aa" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341108 4762 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="60202600-f7cc-4623-abf8-d3f1ad5662aa" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.341165 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6083b27-9cd4-494a-8b51-9dff95918001" containerName="swift-ring-rebalance" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341238 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6083b27-9cd4-494a-8b51-9dff95918001" containerName="swift-ring-rebalance" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.341293 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341366 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.341454 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d53aa13-0847-42e8-92f1-da4e51c714a7" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341591 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d53aa13-0847-42e8-92f1-da4e51c714a7" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: E0217 14:28:39.341693 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c69c000-54f6-4b64-a7fa-454fd519aad5" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.341750 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c69c000-54f6-4b64-a7fa-454fd519aad5" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342024 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="60202600-f7cc-4623-abf8-d3f1ad5662aa" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342095 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="11daea56-42b9-45b6-980a-c6afbe877c80" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342147 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342210 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d85da49-7985-429f-b4ed-d81ab921b28a" containerName="console" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342262 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6083b27-9cd4-494a-8b51-9dff95918001" containerName="swift-ring-rebalance" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342350 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c69c000-54f6-4b64-a7fa-454fd519aad5" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342419 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="46085b5b-97db-43a2-9a40-b6fc4c6d4f60" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342517 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="808ae239-be89-433d-ab1f-8807e658af8d" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342599 4762 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7d53aa13-0847-42e8-92f1-da4e51c714a7" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342691 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c65095d-efc4-4480-b244-55169974d63d" containerName="mariadb-account-create-update" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.342783 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" containerName="mariadb-database-create" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.343634 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.347232 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.370823 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z5pp2"] Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.387888 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=41.747074442 podStartE2EDuration="1m7.387862627s" podCreationTimestamp="2026-02-17 14:27:32 +0000 UTC" firstStartedPulling="2026-02-17 14:27:35.355738332 +0000 UTC m=+1335.935738984" lastFinishedPulling="2026-02-17 14:28:00.996526507 +0000 UTC m=+1361.576527169" observedRunningTime="2026-02-17 14:28:39.340216183 +0000 UTC m=+1399.920216855" watchObservedRunningTime="2026-02-17 14:28:39.387862627 +0000 UTC m=+1399.967863279" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.424708 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=41.850288436 podStartE2EDuration="1m7.424683368s" podCreationTimestamp="2026-02-17 14:27:32 +0000 UTC" firstStartedPulling="2026-02-17 14:27:35.421034665 +0000 UTC m=+1336.001035307" lastFinishedPulling="2026-02-17 14:28:00.995429587 +0000 UTC m=+1361.575430239" observedRunningTime="2026-02-17 14:28:39.402872505 +0000 UTC m=+1399.982873157" watchObservedRunningTime="2026-02-17 14:28:39.424683368 +0000 UTC m=+1400.004684020" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.457825 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.979754724 podStartE2EDuration="1m6.457802828s" podCreationTimestamp="2026-02-17 14:27:33 +0000 UTC" firstStartedPulling="2026-02-17 14:27:35.580779975 +0000 UTC m=+1336.160780627" lastFinishedPulling="2026-02-17 14:28:01.058828079 +0000 UTC m=+1361.638828731" observedRunningTime="2026-02-17 14:28:39.444154257 +0000 UTC m=+1400.024154909" watchObservedRunningTime="2026-02-17 14:28:39.457802828 +0000 UTC m=+1400.037803480" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.509823 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.663478 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " 
pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.663608 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckzr\" (UniqueName: \"kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.668825 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=12.717953848 podStartE2EDuration="21.668784849s" podCreationTimestamp="2026-02-17 14:28:18 +0000 UTC" firstStartedPulling="2026-02-17 14:28:24.167373553 +0000 UTC m=+1384.747374205" lastFinishedPulling="2026-02-17 14:28:33.118204554 +0000 UTC m=+1393.698205206" observedRunningTime="2026-02-17 14:28:39.464046267 +0000 UTC m=+1400.044046919" watchObservedRunningTime="2026-02-17 14:28:39.668784849 +0000 UTC m=+1400.248785501" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.724600 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.974139861 podStartE2EDuration="1m7.724575115s" podCreationTimestamp="2026-02-17 14:27:32 +0000 UTC" firstStartedPulling="2026-02-17 14:27:34.899686172 +0000 UTC m=+1335.479686824" lastFinishedPulling="2026-02-17 14:28:00.650121406 +0000 UTC m=+1361.230122078" observedRunningTime="2026-02-17 14:28:39.689221834 +0000 UTC m=+1400.269222486" watchObservedRunningTime="2026-02-17 14:28:39.724575115 +0000 UTC m=+1400.304575777" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.765838 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.765898 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckzr\" (UniqueName: \"kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.767368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.805570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckzr\" (UniqueName: \"kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr\") pod \"root-account-create-update-z5pp2\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 14:28:39.983753 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" Feb 17 14:28:39 crc kubenswrapper[4762]: I0217 
14:28:39.994709 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:40 crc kubenswrapper[4762]: I0217 14:28:40.069633 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"] Feb 17 14:28:40 crc kubenswrapper[4762]: I0217 14:28:40.347437 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="dnsmasq-dns" containerID="cri-o://246ffe15dbba94feb95110ee0a41781f663ada7a4abb43652d1fffebba205cb9" gracePeriod=10 Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.259253 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z5pp2"] Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.375931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5pp2" event={"ID":"cdc0cab3-27e5-462f-8b21-e97775f8f4b4","Type":"ContainerStarted","Data":"70fd84a61ab82837aed2f362e1ba4323eaf0e542568e4fa4f6f44958ea3ddb40"} Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.389887 4762 generic.go:334] "Generic (PLEG): container finished" podID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerID="246ffe15dbba94feb95110ee0a41781f663ada7a4abb43652d1fffebba205cb9" exitCode=0 Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.390378 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" event={"ID":"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e","Type":"ContainerDied","Data":"246ffe15dbba94feb95110ee0a41781f663ada7a4abb43652d1fffebba205cb9"} Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.466885 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb"] Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.473472 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.500221 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb"] Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.621269 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.621513 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46bmf\" (UniqueName: \"kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.662834 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-32e0-account-create-update-fr87w"] Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.664157 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.668934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.672932 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.685455 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-32e0-account-create-update-fr87w"] Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.723365 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bmf\" (UniqueName: \"kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.723850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.724837 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.752119 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bmf\" (UniqueName: \"kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf\") pod \"mysqld-exporter-openstack-cell1-db-create-4q4bb\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.806831 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.824918 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc\") pod \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.825012 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb\") pod \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.825075 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config\") pod \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.825216 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znppk\" (UniqueName: \"kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk\") pod \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\" (UID: \"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e\") " Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.825590 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlzk\" (UniqueName: \"kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk\") pod \"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.825719 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts\") pod \"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.832886 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk" (OuterVolumeSpecName: "kube-api-access-znppk") pod "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" (UID: "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e"). InnerVolumeSpecName "kube-api-access-znppk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.899794 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config" (OuterVolumeSpecName: "config") pod "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" (UID: "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.900085 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" (UID: "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.928903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" (UID: "75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts\") pod \"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930348 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swlzk\" (UniqueName: \"kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk\") pod \"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930429 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930447 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930456 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znppk\" (UniqueName: \"kubernetes.io/projected/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-kube-api-access-znppk\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.930470 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.931842 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts\") pod \"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.973442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlzk\" (UniqueName: \"kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk\") pod 
\"mysqld-exporter-32e0-account-create-update-fr87w\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:41 crc kubenswrapper[4762]: I0217 14:28:41.983358 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.411427 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" event={"ID":"75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e","Type":"ContainerDied","Data":"a2cbb03ad697a79ee14dced328e082da87373157cafbb1ebb8aee71e9f584e95"} Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.411998 4762 scope.go:117] "RemoveContainer" containerID="246ffe15dbba94feb95110ee0a41781f663ada7a4abb43652d1fffebba205cb9" Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.412174 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-2jm8z" Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.430915 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerStarted","Data":"1efb1c48ce3b3ab106a3f45c6541d341c2a89ee49959ea4a27eb069d425a42b8"} Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.434292 4762 generic.go:334] "Generic (PLEG): container finished" podID="cdc0cab3-27e5-462f-8b21-e97775f8f4b4" containerID="57831539b956592372abb05c0e8265ae6c1b0b4dbde3f14741138fed85b064b3" exitCode=0 Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.434328 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5pp2" event={"ID":"cdc0cab3-27e5-462f-8b21-e97775f8f4b4","Type":"ContainerDied","Data":"57831539b956592372abb05c0e8265ae6c1b0b4dbde3f14741138fed85b064b3"} Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.450173 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"] Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.454876 4762 scope.go:117] "RemoveContainer" containerID="2dd0bfd50a92353c58b477696b8979a4f7277e4757894da2ea8addf23cf1ba42" Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.465553 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-2jm8z"] Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.528462 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb"] Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.732715 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-32e0-account-create-update-fr87w"] Feb 17 14:28:42 crc kubenswrapper[4762]: W0217 14:28:42.733062 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e0fb0bc_3e83_444f_8c0d_701c9e0ed873.slice/crio-bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a WatchSource:0}: Error finding container bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a: Status 404 returned error can't find the container with id bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a Feb 17 14:28:42 crc kubenswrapper[4762]: I0217 14:28:42.918449 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xspft" podUID="0611dcb7-08c7-4999-8bc2-210224f89e66" 
containerName="ovn-controller" probeResult="failure" output=< Feb 17 14:28:42 crc kubenswrapper[4762]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 14:28:42 crc kubenswrapper[4762]: > Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.045773 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tt6cp"] Feb 17 14:28:43 crc kubenswrapper[4762]: E0217 14:28:43.046340 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="init" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.046367 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="init" Feb 17 14:28:43 crc kubenswrapper[4762]: E0217 14:28:43.046433 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="dnsmasq-dns" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.046444 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="dnsmasq-dns" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.046712 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" containerName="dnsmasq-dns" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.047816 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.050072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.050761 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ckfnj" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.077533 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tt6cp"] Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.111501 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpj4\" (UniqueName: \"kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.111582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.111630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.116339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data\") pod \"glance-db-sync-tt6cp\" (UID: 
\"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.191983 4762 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podde8fe6a0-5c88-434f-a653-ee334a757900"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podde8fe6a0-5c88-434f-a653-ee334a757900] : Timed out while waiting for systemd to remove kubepods-besteffort-podde8fe6a0_5c88_434f_a653_ee334a757900.slice" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.218840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpj4\" (UniqueName: \"kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.218911 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.218951 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.219015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.227021 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.229509 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.233470 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.251252 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpj4\" (UniqueName: \"kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4\") pod \"glance-db-sync-tt6cp\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 
14:28:43.366308 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tt6cp" Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.463620 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" event={"ID":"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873","Type":"ContainerStarted","Data":"1d12a4cd06030465a4e1570620e4ca6e43f5d9d69b19757e8a38e91a258121ec"} Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.463693 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" event={"ID":"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873","Type":"ContainerStarted","Data":"bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a"} Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.479606 4762 generic.go:334] "Generic (PLEG): container finished" podID="0270bd57-0aa6-48bf-98ed-d37d70fbb42c" containerID="a1440e9dafbe555aae2a489afab3b11a1e4730a420a470ef5f9c6ab1f6712e72" exitCode=0 Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.479900 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" event={"ID":"0270bd57-0aa6-48bf-98ed-d37d70fbb42c","Type":"ContainerDied","Data":"a1440e9dafbe555aae2a489afab3b11a1e4730a420a470ef5f9c6ab1f6712e72"} Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.480181 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" event={"ID":"0270bd57-0aa6-48bf-98ed-d37d70fbb42c","Type":"ContainerStarted","Data":"5ddd9d62487e20027eb4e435d5d03a899b601639e8ef11eade9c6e35d5b6e293"} Feb 17 14:28:43 crc kubenswrapper[4762]: I0217 14:28:43.532134 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" podStartSLOduration=2.532108841 podStartE2EDuration="2.532108841s" podCreationTimestamp="2026-02-17 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:43.48788478 +0000 UTC m=+1404.067885432" watchObservedRunningTime="2026-02-17 14:28:43.532108841 +0000 UTC m=+1404.112109493" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.101277 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e" path="/var/lib/kubelet/pods/75ab44d0-42b3-4e88-a8f0-c0d7f9ac680e/volumes" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.353548 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.449625 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts\") pod \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.449812 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zckzr\" (UniqueName: \"kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr\") pod \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\" (UID: \"cdc0cab3-27e5-462f-8b21-e97775f8f4b4\") " Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.450245 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdc0cab3-27e5-462f-8b21-e97775f8f4b4" (UID: "cdc0cab3-27e5-462f-8b21-e97775f8f4b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.450588 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.450978 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.456213 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr" (OuterVolumeSpecName: "kube-api-access-zckzr") pod "cdc0cab3-27e5-462f-8b21-e97775f8f4b4" (UID: "cdc0cab3-27e5-462f-8b21-e97775f8f4b4"). InnerVolumeSpecName "kube-api-access-zckzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.458637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/466a7dc3-63d2-4995-ab6f-712df183303d-etc-swift\") pod \"swift-storage-0\" (UID: \"466a7dc3-63d2-4995-ab6f-712df183303d\") " pod="openstack/swift-storage-0" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.496774 4762 generic.go:334] "Generic (PLEG): container finished" podID="7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" containerID="1d12a4cd06030465a4e1570620e4ca6e43f5d9d69b19757e8a38e91a258121ec" exitCode=0 Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.496839 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" event={"ID":"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873","Type":"ContainerDied","Data":"1d12a4cd06030465a4e1570620e4ca6e43f5d9d69b19757e8a38e91a258121ec"} Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.513934 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tt6cp"] Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.552916 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zckzr\" (UniqueName: \"kubernetes.io/projected/cdc0cab3-27e5-462f-8b21-e97775f8f4b4-kube-api-access-zckzr\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.571336 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z5pp2" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.573701 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z5pp2" event={"ID":"cdc0cab3-27e5-462f-8b21-e97775f8f4b4","Type":"ContainerDied","Data":"70fd84a61ab82837aed2f362e1ba4323eaf0e542568e4fa4f6f44958ea3ddb40"} Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.573884 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70fd84a61ab82837aed2f362e1ba4323eaf0e542568e4fa4f6f44958ea3ddb40" Feb 17 14:28:44 crc kubenswrapper[4762]: I0217 14:28:44.624465 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.150501 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.280903 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts\") pod \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.281004 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46bmf\" (UniqueName: \"kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf\") pod \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\" (UID: \"0270bd57-0aa6-48bf-98ed-d37d70fbb42c\") " Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.281898 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0270bd57-0aa6-48bf-98ed-d37d70fbb42c" (UID: "0270bd57-0aa6-48bf-98ed-d37d70fbb42c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.287529 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf" (OuterVolumeSpecName: "kube-api-access-46bmf") pod "0270bd57-0aa6-48bf-98ed-d37d70fbb42c" (UID: "0270bd57-0aa6-48bf-98ed-d37d70fbb42c"). InnerVolumeSpecName "kube-api-access-46bmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.383458 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.383486 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46bmf\" (UniqueName: \"kubernetes.io/projected/0270bd57-0aa6-48bf-98ed-d37d70fbb42c-kube-api-access-46bmf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.538846 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.584031 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tt6cp" event={"ID":"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574","Type":"ContainerStarted","Data":"a8a8552e4bd0a4280ec3178c0314e6f76809e9713d6dffb2e53f1e6a110904e2"} Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.587964 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.588973 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb" event={"ID":"0270bd57-0aa6-48bf-98ed-d37d70fbb42c","Type":"ContainerDied","Data":"5ddd9d62487e20027eb4e435d5d03a899b601639e8ef11eade9c6e35d5b6e293"} Feb 17 14:28:45 crc kubenswrapper[4762]: I0217 14:28:45.589034 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ddd9d62487e20027eb4e435d5d03a899b601639e8ef11eade9c6e35d5b6e293" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.042783 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z5pp2"] Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.097209 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z5pp2"] Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.200986 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.312890 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swlzk\" (UniqueName: \"kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk\") pod \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.312972 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts\") pod \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\" (UID: \"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873\") " Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.314150 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" (UID: "7e0fb0bc-3e83-444f-8c0d-701c9e0ed873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.326692 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk" (OuterVolumeSpecName: "kube-api-access-swlzk") pod "7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" (UID: "7e0fb0bc-3e83-444f-8c0d-701c9e0ed873"). InnerVolumeSpecName "kube-api-access-swlzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.417318 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swlzk\" (UniqueName: \"kubernetes.io/projected/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-kube-api-access-swlzk\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.417366 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.614512 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" event={"ID":"7e0fb0bc-3e83-444f-8c0d-701c9e0ed873","Type":"ContainerDied","Data":"bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a"} Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.614558 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4fc4908e4a32847ab2c1b6d605ac387023706c162290431353af3df47a339a" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.614561 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-32e0-account-create-update-fr87w" Feb 17 14:28:46 crc kubenswrapper[4762]: I0217 14:28:46.621698 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"e8db3f17a8a28b07bef2a6281a34e1446102165ac60ae4731e3b3012ff0dd749"} Feb 17 14:28:47 crc kubenswrapper[4762]: I0217 14:28:47.910484 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xspft" podUID="0611dcb7-08c7-4999-8bc2-210224f89e66" containerName="ovn-controller" probeResult="failure" output=< Feb 17 14:28:47 crc kubenswrapper[4762]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 14:28:47 crc kubenswrapper[4762]: > Feb 17 14:28:47 crc kubenswrapper[4762]: I0217 14:28:47.955850 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:28:47 crc kubenswrapper[4762]: I0217 14:28:47.983028 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7gshj" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.093658 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc0cab3-27e5-462f-8b21-e97775f8f4b4" path="/var/lib/kubelet/pods/cdc0cab3-27e5-462f-8b21-e97775f8f4b4/volumes" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.348210 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xspft-config-gznbp"] Feb 17 14:28:48 crc kubenswrapper[4762]: E0217 14:28:48.348775 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0270bd57-0aa6-48bf-98ed-d37d70fbb42c" containerName="mariadb-database-create" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.348799 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0270bd57-0aa6-48bf-98ed-d37d70fbb42c" containerName="mariadb-database-create" Feb 17 14:28:48 crc kubenswrapper[4762]: E0217 14:28:48.348848 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc 
kubenswrapper[4762]: I0217 14:28:48.348858 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc kubenswrapper[4762]: E0217 14:28:48.348891 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc0cab3-27e5-462f-8b21-e97775f8f4b4" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.348900 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc0cab3-27e5-462f-8b21-e97775f8f4b4" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.349183 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc0cab3-27e5-462f-8b21-e97775f8f4b4" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.349218 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" containerName="mariadb-account-create-update" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.349236 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0270bd57-0aa6-48bf-98ed-d37d70fbb42c" containerName="mariadb-database-create" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.350275 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.359339 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.364108 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.364156 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzc7h\" (UniqueName: \"kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.364333 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.364393 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.364449 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.365085 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.374512 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft-config-gznbp"] Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467077 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467121 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzc7h\" (UniqueName: \"kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467153 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467179 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467205 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467296 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " 
pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467970 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.467987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.468380 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.469897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.497721 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzc7h\" (UniqueName: \"kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h\") pod \"ovn-controller-xspft-config-gznbp\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:48 crc kubenswrapper[4762]: I0217 14:28:48.676031 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.353054 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qhtwl"] Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.354950 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.356976 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.365423 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhtwl"] Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.490109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpxd\" (UniqueName: \"kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.490484 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.572672 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.592827 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpxd\" (UniqueName: \"kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.593043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.593902 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.623797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpxd\" (UniqueName: \"kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd\") pod \"root-account-create-update-qhtwl\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:49 crc kubenswrapper[4762]: I0217 14:28:49.684377 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:50 crc kubenswrapper[4762]: I0217 14:28:50.697855 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerStarted","Data":"4d18515534cd887e69de1ddc03d6cdec0ccd05316ea6be2f3e0413c2722ef6f7"} Feb 17 14:28:50 crc kubenswrapper[4762]: I0217 14:28:50.706831 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"abda2034a8823f01ee849897ce6a86c2d682948b563518bc3077320f1ab4a1dc"} Feb 17 14:28:50 crc kubenswrapper[4762]: I0217 14:28:50.751968 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.724326715 podStartE2EDuration="1m10.751941108s" podCreationTimestamp="2026-02-17 14:27:40 +0000 UTC" firstStartedPulling="2026-02-17 14:28:02.116252325 +0000 UTC m=+1362.696252967" lastFinishedPulling="2026-02-17 14:28:50.143866708 +0000 UTC m=+1410.723867360" observedRunningTime="2026-02-17 14:28:50.742392768 +0000 UTC m=+1411.322393420" watchObservedRunningTime="2026-02-17 14:28:50.751941108 +0000 UTC m=+1411.331941760" Feb 17 14:28:50 crc kubenswrapper[4762]: I0217 14:28:50.779117 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft-config-gznbp"] Feb 17 14:28:50 crc kubenswrapper[4762]: I0217 14:28:50.792167 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhtwl"] Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.722606 4762 generic.go:334] "Generic (PLEG): container finished" podID="95a9e2ec-f495-439a-8329-ad40dd007430" containerID="2374a3728cd95390711955d903773f7b4614b1795c447ce88c14d0a6d7eaaa26" exitCode=0 Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.722767 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhtwl" event={"ID":"95a9e2ec-f495-439a-8329-ad40dd007430","Type":"ContainerDied","Data":"2374a3728cd95390711955d903773f7b4614b1795c447ce88c14d0a6d7eaaa26"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.723301 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhtwl" event={"ID":"95a9e2ec-f495-439a-8329-ad40dd007430","Type":"ContainerStarted","Data":"40bc8135bda970be217f28cd30a606fe0485956c6eea23e3034f5f01f0742618"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.727350 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-gznbp" event={"ID":"1f27d77a-f6dc-4c0a-96e3-79176ce7819f","Type":"ContainerStarted","Data":"04011dc64b4c9f1f4b73753d11fcd7079b50ab16e9d738bd6611369fe1d52847"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.727399 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-gznbp" event={"ID":"1f27d77a-f6dc-4c0a-96e3-79176ce7819f","Type":"ContainerStarted","Data":"956f8c6eae411d89ed01020b7d8cb2b10593b72a6e8a7075c06e067911a099e4"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.733090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"d5cd6c5a274cfd80bad1441aa0622b1b8bcc4612ace9eb3dda6451e6fa4a47ea"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.733347 
4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"8c780b79a96d17d426f4caae8389297376b3b7a3c657f0060145d6a99a3e0c14"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.733433 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"46c0b63d3106b4baf654f333d505861f3d84de63e0d53e604fa3f5a211e03fec"} Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.773519 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.775410 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.778987 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.793176 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.826472 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xspft-config-gznbp" podStartSLOduration=3.826443987 podStartE2EDuration="3.826443987s" podCreationTimestamp="2026-02-17 14:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:51.795361973 +0000 UTC m=+1412.375362635" watchObservedRunningTime="2026-02-17 14:28:51.826443987 +0000 UTC m=+1412.406444639" Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.978186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.978540 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655dk\" (UniqueName: \"kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:51 crc kubenswrapper[4762]: I0217 14:28:51.978566 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.043773 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.081829 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-655dk\" (UniqueName: \"kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.081887 
4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.082091 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.091747 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.092416 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.129503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-655dk\" (UniqueName: \"kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk\") pod \"mysqld-exporter-0\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.415597 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.750893 4762 generic.go:334] "Generic (PLEG): container finished" podID="1f27d77a-f6dc-4c0a-96e3-79176ce7819f" containerID="04011dc64b4c9f1f4b73753d11fcd7079b50ab16e9d738bd6611369fe1d52847" exitCode=0 Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.750964 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-gznbp" event={"ID":"1f27d77a-f6dc-4c0a-96e3-79176ce7819f","Type":"ContainerDied","Data":"04011dc64b4c9f1f4b73753d11fcd7079b50ab16e9d738bd6611369fe1d52847"} Feb 17 14:28:52 crc kubenswrapper[4762]: I0217 14:28:52.939664 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xspft" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.103146 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:28:53 crc kubenswrapper[4762]: W0217 14:28:53.374722 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee9b9ac0_7ac0_421a_a94d_8b25a433e7e2.slice/crio-ae9451183557f75a2b0627cf76735216c702a245f2b97d42a3464d54f14ea026 WatchSource:0}: Error finding container ae9451183557f75a2b0627cf76735216c702a245f2b97d42a3464d54f14ea026: Status 404 returned error can't find the container with id ae9451183557f75a2b0627cf76735216c702a245f2b97d42a3464d54f14ea026 Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.591830 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.737122 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts\") pod \"95a9e2ec-f495-439a-8329-ad40dd007430\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.737625 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxpxd\" (UniqueName: \"kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd\") pod \"95a9e2ec-f495-439a-8329-ad40dd007430\" (UID: \"95a9e2ec-f495-439a-8329-ad40dd007430\") " Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.738378 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95a9e2ec-f495-439a-8329-ad40dd007430" (UID: "95a9e2ec-f495-439a-8329-ad40dd007430"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.743910 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd" (OuterVolumeSpecName: "kube-api-access-vxpxd") pod "95a9e2ec-f495-439a-8329-ad40dd007430" (UID: "95a9e2ec-f495-439a-8329-ad40dd007430"). InnerVolumeSpecName "kube-api-access-vxpxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.749365 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:28:53 crc kubenswrapper[4762]: E0217 14:28:53.749973 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a9e2ec-f495-439a-8329-ad40dd007430" containerName="mariadb-account-create-update" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.749993 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a9e2ec-f495-439a-8329-ad40dd007430" containerName="mariadb-account-create-update" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.750201 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a9e2ec-f495-439a-8329-ad40dd007430" containerName="mariadb-account-create-update" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.751784 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.785417 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.797804 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhtwl" event={"ID":"95a9e2ec-f495-439a-8329-ad40dd007430","Type":"ContainerDied","Data":"40bc8135bda970be217f28cd30a606fe0485956c6eea23e3034f5f01f0742618"} Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.797868 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40bc8135bda970be217f28cd30a606fe0485956c6eea23e3034f5f01f0742618" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.797967 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhtwl" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.807968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"4e78127c86876beff55abe9f22519bc9472f599dbbcf5f5e6218dbc190e276e6"} Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.840430 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a9e2ec-f495-439a-8329-ad40dd007430-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.840460 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxpxd\" (UniqueName: \"kubernetes.io/projected/95a9e2ec-f495-439a-8329-ad40dd007430-kube-api-access-vxpxd\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.842744 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2","Type":"ContainerStarted","Data":"ae9451183557f75a2b0627cf76735216c702a245f2b97d42a3464d54f14ea026"} Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.942912 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.942976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:53 crc kubenswrapper[4762]: I0217 14:28:53.943071 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgzw\" (UniqueName: \"kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.045521 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.045615 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgzw\" (UniqueName: \"kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.045822 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities\") pod \"redhat-operators-66nlq\" (UID: 
\"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.046317 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.046602 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.081433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgzw\" (UniqueName: \"kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw\") pod \"redhat-operators-66nlq\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.109204 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.208828 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="12862d08-7816-4a6d-9a52-aceeae5e1d8e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.127:5671: connect: connection refused" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.517248 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="d23bccd7-14f7-419d-95db-38470afb02b0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.543373 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="391886d8-341f-4e66-980c-00f6cd881e10" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.627473 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6c34ffbd-b33d-4579-8a4d-a51ef852b1a1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.755166 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.868663 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"9494b82efcc79deb18786ae229c5e9e317c9e5bcc5c25dbe4bd1dd18078e8688"} Feb 17 14:28:54 crc kubenswrapper[4762]: I0217 14:28:54.868726 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"d7a58e6af5b0eb17c2361caecf2c4d112c608b690e9ef8ed0e043b40210db654"} Feb 17 14:28:55 crc kubenswrapper[4762]: W0217 14:28:55.483237 4762 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1b8d793_bf38_4c87_8830_21b7dc5ad129.slice/crio-a891055456b5d44d8d88ca49c1b18f0a38ab368180609450008092bdb9761cc1 WatchSource:0}: Error finding container a891055456b5d44d8d88ca49c1b18f0a38ab368180609450008092bdb9761cc1: Status 404 returned error can't find the container with id a891055456b5d44d8d88ca49c1b18f0a38ab368180609450008092bdb9761cc1 Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.712654 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803024 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803105 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzc7h\" (UniqueName: \"kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803303 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803365 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803387 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.803473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn\") pod \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\" (UID: \"1f27d77a-f6dc-4c0a-96e3-79176ce7819f\") " Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.804060 4762 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.804185 4762 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.804358 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run" (OuterVolumeSpecName: "var-run") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.804914 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.807908 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts" (OuterVolumeSpecName: "scripts") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.814595 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h" (OuterVolumeSpecName: "kube-api-access-qzc7h") pod "1f27d77a-f6dc-4c0a-96e3-79176ce7819f" (UID: "1f27d77a-f6dc-4c0a-96e3-79176ce7819f"). InnerVolumeSpecName "kube-api-access-qzc7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.884704 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerStarted","Data":"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d"} Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.884765 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerStarted","Data":"a891055456b5d44d8d88ca49c1b18f0a38ab368180609450008092bdb9761cc1"} Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.886629 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-gznbp" event={"ID":"1f27d77a-f6dc-4c0a-96e3-79176ce7819f","Type":"ContainerDied","Data":"956f8c6eae411d89ed01020b7d8cb2b10593b72a6e8a7075c06e067911a099e4"} Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.886701 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956f8c6eae411d89ed01020b7d8cb2b10593b72a6e8a7075c06e067911a099e4" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.886794 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xspft-config-gznbp" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.910029 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.910071 4762 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.910084 4762 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.910097 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzc7h\" (UniqueName: \"kubernetes.io/projected/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-kube-api-access-qzc7h\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:55 crc kubenswrapper[4762]: I0217 14:28:55.910110 4762 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f27d77a-f6dc-4c0a-96e3-79176ce7819f-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.132793 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qhtwl"] Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.146136 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qhtwl"] Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.823605 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xspft-config-gznbp"] Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.834491 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xspft-config-gznbp"] Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.902285 4762 generic.go:334] "Generic (PLEG): container finished" podID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerID="35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d" exitCode=0 Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.902345 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerDied","Data":"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d"} Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.911669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"91d9d3c7dbdf090b81f89d4546808e8d1a78e8d26d85a254865700149e078d1f"} Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.914249 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2","Type":"ContainerStarted","Data":"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b"} Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.931236 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xspft-config-vz647"] Feb 17 14:28:56 crc kubenswrapper[4762]: E0217 14:28:56.931864 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f27d77a-f6dc-4c0a-96e3-79176ce7819f" containerName="ovn-config" Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.931895 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f27d77a-f6dc-4c0a-96e3-79176ce7819f" containerName="ovn-config" Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.932164 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f27d77a-f6dc-4c0a-96e3-79176ce7819f" containerName="ovn-config" Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.932934 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:56 crc kubenswrapper[4762]: I0217 14:28:56.935737 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.020408 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft-config-vz647"] Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.040570 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.040695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.040780 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.040877 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.040940 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trt2v\" (UniqueName: \"kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.041135 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.043505 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.049471 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.059568 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.904798856 podStartE2EDuration="6.059544392s" podCreationTimestamp="2026-02-17 14:28:51 +0000 UTC" firstStartedPulling="2026-02-17 14:28:53.383472396 +0000 UTC m=+1413.963473048" lastFinishedPulling="2026-02-17 14:28:55.538217932 +0000 UTC m=+1416.118218584" observedRunningTime="2026-02-17 14:28:56.944701392 +0000 UTC m=+1417.524702034" watchObservedRunningTime="2026-02-17 14:28:57.059544392 +0000 UTC m=+1417.639545044" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.145792 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.145947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146008 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trt2v\" (UniqueName: \"kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146186 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn\") pod 
\"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146255 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.146429 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.147339 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.148789 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.168098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trt2v\" (UniqueName: \"kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v\") pod \"ovn-controller-xspft-config-vz647\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.258396 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.833577 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xspft-config-vz647"] Feb 17 14:28:57 crc kubenswrapper[4762]: I0217 14:28:57.949294 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:58 crc kubenswrapper[4762]: I0217 14:28:58.097128 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f27d77a-f6dc-4c0a-96e3-79176ce7819f" path="/var/lib/kubelet/pods/1f27d77a-f6dc-4c0a-96e3-79176ce7819f/volumes" Feb 17 14:28:58 crc kubenswrapper[4762]: I0217 14:28:58.098167 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a9e2ec-f495-439a-8329-ad40dd007430" path="/var/lib/kubelet/pods/95a9e2ec-f495-439a-8329-ad40dd007430/volumes" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.458892 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zvgmb"] Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.460753 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.463881 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.475693 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zvgmb"] Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.499793 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.499872 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqt5m\" (UniqueName: \"kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.601267 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.601319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqt5m\" (UniqueName: \"kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.602238 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.622408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqt5m\" (UniqueName: \"kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m\") pod \"root-account-create-update-zvgmb\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " pod="openstack/root-account-create-update-zvgmb" Feb 17 14:28:59 crc kubenswrapper[4762]: I0217 14:28:59.786192 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zvgmb" Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.160605 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.161295 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus" containerID="cri-o://3cd041b3d46bc24d231294c9e613858fe5c95b7ae71f17e4af6727b51ee49c66" gracePeriod=600 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.161452 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="thanos-sidecar" containerID="cri-o://4d18515534cd887e69de1ddc03d6cdec0ccd05316ea6be2f3e0413c2722ef6f7" gracePeriod=600 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.161500 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="config-reloader" containerID="cri-o://1efb1c48ce3b3ab106a3f45c6541d341c2a89ee49959ea4a27eb069d425a42b8" gracePeriod=600 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966061 4762 generic.go:334] "Generic (PLEG): container finished" podID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerID="4d18515534cd887e69de1ddc03d6cdec0ccd05316ea6be2f3e0413c2722ef6f7" exitCode=0 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966106 4762 generic.go:334] "Generic (PLEG): container finished" podID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerID="1efb1c48ce3b3ab106a3f45c6541d341c2a89ee49959ea4a27eb069d425a42b8" exitCode=0 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966114 4762 generic.go:334] "Generic (PLEG): container finished" podID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerID="3cd041b3d46bc24d231294c9e613858fe5c95b7ae71f17e4af6727b51ee49c66" exitCode=0 Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966146 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerDied","Data":"4d18515534cd887e69de1ddc03d6cdec0ccd05316ea6be2f3e0413c2722ef6f7"} Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966195 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerDied","Data":"1efb1c48ce3b3ab106a3f45c6541d341c2a89ee49959ea4a27eb069d425a42b8"} Feb 17 14:29:00 crc kubenswrapper[4762]: I0217 14:29:00.966206 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerDied","Data":"3cd041b3d46bc24d231294c9e613858fe5c95b7ae71f17e4af6727b51ee49c66"} Feb 17 14:29:02 crc kubenswrapper[4762]: I0217 14:29:02.049446 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.135:9090/-/ready\": dial tcp 10.217.0.135:9090: connect: connection refused" Feb 17 14:29:04 crc kubenswrapper[4762]: I0217 14:29:04.204916 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Feb 17 14:29:04 crc kubenswrapper[4762]: I0217 14:29:04.525690 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Feb 17 14:29:04 crc kubenswrapper[4762]: I0217 14:29:04.547474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Feb 17 14:29:04 crc kubenswrapper[4762]: I0217 14:29:04.628030 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:29:06 crc kubenswrapper[4762]: W0217 14:29:06.810743 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod395d7b35_d540_4222_8009_d29b24d0f1be.slice/crio-59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4 WatchSource:0}: Error finding container 59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4: Status 404 returned error can't find the container with id 59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4 Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.036997 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-vz647" event={"ID":"395d7b35-d540-4222-8009-d29b24d0f1be","Type":"ContainerStarted","Data":"59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4"} Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.422189 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.526781 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.526823 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.526887 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.526912 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.526951 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.527061 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.527159 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.527267 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.527501 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.527539 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9mp\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp\") pod \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\" (UID: \"80db8f3d-cc50-4a3e-8cad-52f614221b4d\") " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.531229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.531253 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.536906 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.545821 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out" (OuterVolumeSpecName: "config-out") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.554237 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.562969 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.568221 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config" (OuterVolumeSpecName: "config") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.568224 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp" (OuterVolumeSpecName: "kube-api-access-bx9mp") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "kube-api-access-bx9mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635065 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635095 4762 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635105 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635116 4762 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635125 4762 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80db8f3d-cc50-4a3e-8cad-52f614221b4d-config-out\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635135 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635144 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80db8f3d-cc50-4a3e-8cad-52f614221b4d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.635152 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9mp\" (UniqueName: \"kubernetes.io/projected/80db8f3d-cc50-4a3e-8cad-52f614221b4d-kube-api-access-bx9mp\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.715894 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config" (OuterVolumeSpecName: "web-config") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.744918 4762 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80db8f3d-cc50-4a3e-8cad-52f614221b4d-web-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.812768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "80db8f3d-cc50-4a3e-8cad-52f614221b4d" (UID: "80db8f3d-cc50-4a3e-8cad-52f614221b4d"). InnerVolumeSpecName "pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.846968 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") on node \"crc\" " Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.949319 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 14:29:07 crc kubenswrapper[4762]: I0217 14:29:07.949516 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10") on node "crc" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.051943 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.128706 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.131684 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerStarted","Data":"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba"} Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.131879 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80db8f3d-cc50-4a3e-8cad-52f614221b4d","Type":"ContainerDied","Data":"ea238ac7460842a43b0355902aebd50619903e918c2c80fb84a477ab2ce9c7f9"} Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.131974 4762 scope.go:117] "RemoveContainer" containerID="4d18515534cd887e69de1ddc03d6cdec0ccd05316ea6be2f3e0413c2722ef6f7" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.233819 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lrcjs"] Feb 17 14:29:08 crc kubenswrapper[4762]: E0217 14:29:08.238050 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="config-reloader" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.238087 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="config-reloader" Feb 17 14:29:08 crc kubenswrapper[4762]: E0217 14:29:08.238106 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.238114 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus" Feb 17 14:29:08 crc kubenswrapper[4762]: E0217 14:29:08.238210 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="thanos-sidecar" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.239219 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="thanos-sidecar" Feb 17 14:29:08 crc kubenswrapper[4762]: E0217 
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.239288 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="init-config-reloader"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.240774 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.240822 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="thanos-sidecar"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.240851 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="config-reloader"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.242941 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.286483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5gbq\" (UniqueName: \"kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.286592 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.295799 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zvgmb"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.342897 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lrcjs"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.355740 4762 scope.go:117] "RemoveContainer" containerID="1efb1c48ce3b3ab106a3f45c6541d341c2a89ee49959ea4a27eb069d425a42b8"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.377169 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a355-account-create-update-wzz5t"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.378703 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.381027 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.399299 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.399396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.399476 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2jb\" (UniqueName: \"kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.399624 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5gbq\" (UniqueName: \"kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.400752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.404699 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a355-account-create-update-wzz5t"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.445297 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-z944d"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.447146 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.466993 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5gbq\" (UniqueName: \"kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq\") pod \"cinder-db-create-lrcjs\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.469737 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-z944d"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.506263 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc86k\" (UniqueName: \"kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.506333 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2jb\" (UniqueName: \"kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.506483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.506536 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.507406 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.512001 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.586005 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2jb\" (UniqueName: \"kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb\") pod \"cinder-a355-account-create-update-wzz5t\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.586330 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.589387 4762 scope.go:117] "RemoveContainer" containerID="3cd041b3d46bc24d231294c9e613858fe5c95b7ae71f17e4af6727b51ee49c66"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.613608 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.614014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc86k\" (UniqueName: \"kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.619102 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.630489 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.646055 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc86k\" (UniqueName: \"kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k\") pod \"heat-db-create-z944d\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.662281 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-8332-account-create-update-8vvzv"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.664132 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.666828 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.669058 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.676352 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.676688 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.676877 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xlgsb"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.677049 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.677188 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.688880 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.688958 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.695267 4762 scope.go:117] "RemoveContainer" containerID="26eac05bc40a7e99203d2d5e5eda0e1ea377002924f146a145f67079e2beb4d3"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.699050 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.699286 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.704806 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.713500 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lrcjs"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.716037 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a355-account-create-update-wzz5t"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.782151 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8332-account-create-update-8vvzv"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.818123 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7wqqm"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.819606 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821619 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821662 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821714 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821742 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.821771 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.822738 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.822795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.822861 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bad07381-6a78-4418-b451-0521ee7d95f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.822935 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.822988 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.823058 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.823245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.823317 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ssz\" (UniqueName: \"kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.823339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2g5t\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-kube-api-access-p2g5t\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.862470 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7wqqm"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.883258 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-q6l4w"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.884712 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.889679 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.890068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.890213 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.890357 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jgkd7"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.900158 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q6l4w"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.910025 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z944d"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925426 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925532 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65ssz\" (UniqueName: \"kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925594 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2g5t\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-kube-api-access-p2g5t\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925702 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.925943 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926034 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926096 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926205 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw6b6\" (UniqueName: \"kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926389 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926449 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926530 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bad07381-6a78-4418-b451-0521ee7d95f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926628 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.926845 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.928503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.933991 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.934561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.934679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.934729 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-be62-account-create-update-sl2zr"] Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.935001 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.935555 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bad07381-6a78-4418-b451-0521ee7d95f9-prometheus-metric-storage-rulefiles-0\") pod 
\"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.936067 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.936205 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be62-account-create-update-sl2zr" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.944691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.947123 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.949884 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tvd94"] Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.951797 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tvd94" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.957114 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bad07381-6a78-4418-b451-0521ee7d95f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.957202 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.958693 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bad07381-6a78-4418-b451-0521ee7d95f9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.960701 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-be62-account-create-update-sl2zr"] Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.960985 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.968763 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2g5t\" (UniqueName: \"kubernetes.io/projected/bad07381-6a78-4418-b451-0521ee7d95f9-kube-api-access-p2g5t\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.969689 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.969747 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a14786d82eecf667a32c06b804e4be54e2c76b1ecf1137b60c795c6a56a8bc4a/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.992490 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tvd94"]
Feb 17 14:29:08 crc kubenswrapper[4762]: I0217 14:29:08.997227 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ssz\" (UniqueName: \"kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz\") pod \"heat-8332-account-create-update-8vvzv\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.024326 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b315-account-create-update-nnnmm"]
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.026019 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.032076 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033694 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw6b6\" (UniqueName: \"kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrnp\" (UniqueName: \"kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033839 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fcm\" (UniqueName: \"kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033917 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033952 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.033990 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.034036 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbdz\" (UniqueName: \"kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.035139 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.035235 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8332-account-create-update-8vvzv"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.049401 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b315-account-create-update-nnnmm"]
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.053091 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-945d1be9-e80f-4733-a8dc-8bc8d124eb10\") pod \"prometheus-metric-storage-0\" (UID: \"bad07381-6a78-4418-b451-0521ee7d95f9\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.211799 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-vz647" event={"ID":"395d7b35-d540-4222-8009-d29b24d0f1be","Type":"ContainerStarted","Data":"5d0df22f7fd59f68d826d32d34c1cbd872159e007a31d5f544c8ef3bc6f3e281"}
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.219835 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvgmb" event={"ID":"e8bc1c0d-6392-40df-a3e9-3800d78b8a46","Type":"ContainerStarted","Data":"630e37dab7f019f6a2702f87903daaf8a2d343b5f5d4e2a8a3d76495731261c0"}
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.219896 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvgmb" event={"ID":"e8bc1c0d-6392-40df-a3e9-3800d78b8a46","Type":"ContainerStarted","Data":"0f6b07184d4ec7e0e77f730fdacc51fe2d3c82739f93a456f57cbec130722f4f"}
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.230291 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw6b6\" (UniqueName: \"kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6\") pod \"barbican-db-create-7wqqm\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.242911 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"08d864f7c17da00d20bdea198e6389782d28bb7c716674e94548b05d55a67ba1"}
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.286230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrnp\" (UniqueName: \"kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.286373 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.289478 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fcm\" (UniqueName: \"kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.289803 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.289920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.289960 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.290177 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glbdz\" (UniqueName: \"kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.294391 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.294509 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.298693 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.312403 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.321477 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7wqqm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.364551 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrnp\" (UniqueName: \"kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp\") pod \"neutron-db-create-tvd94\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.366377 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fcm\" (UniqueName: \"kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm\") pod \"keystone-db-sync-q6l4w\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.416958 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbdz\" (UniqueName: \"kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz\") pod \"neutron-be62-account-create-update-sl2zr\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.419286 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.424494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.424583 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqdv\" (UniqueName: \"kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.427034 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q6l4w"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.476399 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be62-account-create-update-sl2zr"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.529550 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tvd94"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.535735 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.535797 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqdv\" (UniqueName: \"kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.577052 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xspft-config-vz647" podStartSLOduration=13.577028725 podStartE2EDuration="13.577028725s" podCreationTimestamp="2026-02-17 14:28:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:09.256430866 +0000 UTC m=+1429.836431508" watchObservedRunningTime="2026-02-17 14:29:09.577028725 +0000 UTC m=+1430.157029377"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.580395 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.598713 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqdv\" (UniqueName: \"kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv\") pod \"barbican-b315-account-create-update-nnnmm\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.621363 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zvgmb" podStartSLOduration=10.621336128 podStartE2EDuration="10.621336128s" podCreationTimestamp="2026-02-17 14:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:09.323173279 +0000 UTC m=+1429.903173941" watchObservedRunningTime="2026-02-17 14:29:09.621336128 +0000 UTC m=+1430.201336780"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.847091 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b315-account-create-update-nnnmm"
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.879721 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a355-account-create-update-wzz5t"]
Feb 17 14:29:09 crc kubenswrapper[4762]: I0217 14:29:09.886852 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lrcjs"]
Feb 17 14:29:09 crc kubenswrapper[4762]: W0217 14:29:09.974638 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee986585_bdb5_4bed_8002_7cf0a80784a8.slice/crio-cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616 WatchSource:0}: Error finding container cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616: Status 404 returned error can't find the container with id cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.046744 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.135:9090/-/ready\": dial tcp 10.217.0.135:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.155494 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80db8f3d-cc50-4a3e-8cad-52f614221b4d" path="/var/lib/kubelet/pods/80db8f3d-cc50-4a3e-8cad-52f614221b4d/volumes"
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.180980 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-z944d"]
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.408714 4762 generic.go:334] "Generic (PLEG): container finished" podID="395d7b35-d540-4222-8009-d29b24d0f1be" containerID="5d0df22f7fd59f68d826d32d34c1cbd872159e007a31d5f544c8ef3bc6f3e281" exitCode=0
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.410370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-vz647" event={"ID":"395d7b35-d540-4222-8009-d29b24d0f1be","Type":"ContainerDied","Data":"5d0df22f7fd59f68d826d32d34c1cbd872159e007a31d5f544c8ef3bc6f3e281"}
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.443244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a355-account-create-update-wzz5t" event={"ID":"ee986585-bdb5-4bed-8002-7cf0a80784a8","Type":"ContainerStarted","Data":"cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616"}
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.445948 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8332-account-create-update-8vvzv"]
Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.467658 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tt6cp" event={"ID":"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574","Type":"ContainerStarted","Data":"6891113cf2d6697324e6a167a135f0c060a38fb3d450da77bda9de60f207c8f2"}
Feb 17 14:29:10 crc kubenswrapper[4762]: W0217 14:29:10.478338 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43ed625c_d879_4409_9450_d61b3f7cc686.slice/crio-62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc WatchSource:0}: Error finding container 
62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc: Status 404 returned error can't find the container with id 62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.506841 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7wqqm"] Feb 17 14:29:10 crc kubenswrapper[4762]: W0217 14:29:10.523145 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b691b6d_c42b_491d_a1d0_3c5cb236598b.slice/crio-adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470 WatchSource:0}: Error finding container adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470: Status 404 returned error can't find the container with id adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470 Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.523424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"be4fc1fc3ae558b9bbce641c0184bcdf45d09631215b2036157559974e8aaf43"} Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.563478 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tt6cp" podStartSLOduration=4.720992807 podStartE2EDuration="27.563449232s" podCreationTimestamp="2026-02-17 14:28:43 +0000 UTC" firstStartedPulling="2026-02-17 14:28:44.550425955 +0000 UTC m=+1405.130426607" lastFinishedPulling="2026-02-17 14:29:07.39288238 +0000 UTC m=+1427.972883032" observedRunningTime="2026-02-17 14:29:10.527413913 +0000 UTC m=+1431.107414565" watchObservedRunningTime="2026-02-17 14:29:10.563449232 +0000 UTC m=+1431.143449874" Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.566188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrcjs" event={"ID":"93fb932d-6901-44d9-a508-a32692308154","Type":"ContainerStarted","Data":"70c203b5b567c8d0992e3550593c7ba4b7e1dfd0bc279bb3973333529fe62a0c"} Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.581866 4762 generic.go:334] "Generic (PLEG): container finished" podID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerID="3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba" exitCode=0 Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.582592 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerDied","Data":"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba"} Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.741699 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q6l4w"] Feb 17 14:29:10 crc kubenswrapper[4762]: I0217 14:29:10.798534 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:29:10 crc kubenswrapper[4762]: W0217 14:29:10.852190 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad07381_6a78_4418_b451_0521ee7d95f9.slice/crio-42f3dcc214162581c714ea4b35d8c0d38858325bc19f5cf0e3e7c8711c4719a4 WatchSource:0}: Error finding container 42f3dcc214162581c714ea4b35d8c0d38858325bc19f5cf0e3e7c8711c4719a4: Status 404 returned error can't find the container with id 42f3dcc214162581c714ea4b35d8c0d38858325bc19f5cf0e3e7c8711c4719a4 
Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.006145 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-be62-account-create-update-sl2zr"] Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.133115 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tvd94"] Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.222037 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b315-account-create-update-nnnmm"] Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.600025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q6l4w" event={"ID":"8acf7e9f-6215-417b-b385-68b30decf4c8","Type":"ContainerStarted","Data":"0b35975ac9f3690990cbb5eb02794889182693f6b475e0c1eb88db555ec1f1f7"} Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.601439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7wqqm" event={"ID":"3b691b6d-c42b-491d-a1d0-3c5cb236598b","Type":"ContainerStarted","Data":"adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470"} Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.608556 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerStarted","Data":"42f3dcc214162581c714ea4b35d8c0d38858325bc19f5cf0e3e7c8711c4719a4"} Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.620917 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z944d" event={"ID":"d8300c70-e571-49c5-a403-d645237d7012","Type":"ContainerStarted","Data":"52c9fcb7745dc3768588a2eff3eac5d4d9c26668148f744f033828d29f11e00f"} Feb 17 14:29:11 crc kubenswrapper[4762]: I0217 14:29:11.624350 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8332-account-create-update-8vvzv" event={"ID":"43ed625c-d879-4409-9450-d61b3f7cc686","Type":"ContainerStarted","Data":"62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc"} Feb 17 14:29:11 crc kubenswrapper[4762]: W0217 14:29:11.629055 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad6e8de_6bb3_4a3e_b664_db44abab1875.slice/crio-f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167 WatchSource:0}: Error finding container f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167: Status 404 returned error can't find the container with id f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167 Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.543544 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563295 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563448 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563480 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563550 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trt2v\" (UniqueName: \"kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563578 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563590 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563685 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563715 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run\") pod \"395d7b35-d540-4222-8009-d29b24d0f1be\" (UID: \"395d7b35-d540-4222-8009-d29b24d0f1be\") " Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.563987 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run" (OuterVolumeSpecName: "var-run") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.564452 4762 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.564481 4762 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.564493 4762 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/395d7b35-d540-4222-8009-d29b24d0f1be-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.564523 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.564808 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts" (OuterVolumeSpecName: "scripts") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.575415 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v" (OuterVolumeSpecName: "kube-api-access-trt2v") pod "395d7b35-d540-4222-8009-d29b24d0f1be" (UID: "395d7b35-d540-4222-8009-d29b24d0f1be"). InnerVolumeSpecName "kube-api-access-trt2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.659179 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8332-account-create-update-8vvzv" event={"ID":"43ed625c-d879-4409-9450-d61b3f7cc686","Type":"ContainerStarted","Data":"7b78434d42294952137d4e9b42996fd1d92e1096fa03ab5d7c829ec188c416fa"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.665686 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tvd94" event={"ID":"7220a0cb-7e9b-4648-ae3c-3289c1aa3493","Type":"ContainerStarted","Data":"b89fd92eb8a368b84e6a672c76e39069e38c02895857ae1e77aa283881d886ed"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.665731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tvd94" event={"ID":"7220a0cb-7e9b-4648-ae3c-3289c1aa3493","Type":"ContainerStarted","Data":"5bc44e17f8c7431a0b7d2e7b57fb305c5680f807bd256f7486001524ab363d64"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.667526 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trt2v\" (UniqueName: \"kubernetes.io/projected/395d7b35-d540-4222-8009-d29b24d0f1be-kube-api-access-trt2v\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.667571 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.667584 4762 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/395d7b35-d540-4222-8009-d29b24d0f1be-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.677876 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z944d" event={"ID":"d8300c70-e571-49c5-a403-d645237d7012","Type":"ContainerStarted","Data":"01cf411bdaa952701750a9df2a25a47608282543566e90ccf00178957239f1ce"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.693738 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-8332-account-create-update-8vvzv" podStartSLOduration=4.693711374 podStartE2EDuration="4.693711374s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.68029355 +0000 UTC m=+1433.260294202" watchObservedRunningTime="2026-02-17 14:29:12.693711374 +0000 UTC m=+1433.273712026" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.703121 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-z944d" podStartSLOduration=4.703089919 podStartE2EDuration="4.703089919s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.696745367 +0000 UTC m=+1433.276746019" watchObservedRunningTime="2026-02-17 14:29:12.703089919 +0000 UTC m=+1433.283090571" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.712547 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b315-account-create-update-nnnmm" 
event={"ID":"8ad6e8de-6bb3-4a3e-b664-db44abab1875","Type":"ContainerStarted","Data":"3dcc57905933c53b081cbe5b6724219a68df8eca2edf14101a8004213f41dd23"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.712608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b315-account-create-update-nnnmm" event={"ID":"8ad6e8de-6bb3-4a3e-b664-db44abab1875","Type":"ContainerStarted","Data":"f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.715006 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xspft-config-vz647" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.715008 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xspft-config-vz647" event={"ID":"395d7b35-d540-4222-8009-d29b24d0f1be","Type":"ContainerDied","Data":"59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.715118 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59d6d905296e1e83e9b050621d5beb1ae67987367605c68ad0ac3a55769740b4" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.719122 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a355-account-create-update-wzz5t" event={"ID":"ee986585-bdb5-4bed-8002-7cf0a80784a8","Type":"ContainerStarted","Data":"9f1ce5996958f9dc7ad6f6950a8991ff22e19800bb34ab246870e6e484d2caab"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.723492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7wqqm" event={"ID":"3b691b6d-c42b-491d-a1d0-3c5cb236598b","Type":"ContainerStarted","Data":"f76f0a45f4c784522da9919e5d767233cb61dece1943b8b5e5308eda5839e74e"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.738649 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-tvd94" podStartSLOduration=4.738589414 podStartE2EDuration="4.738589414s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.716464523 +0000 UTC m=+1433.296465175" watchObservedRunningTime="2026-02-17 14:29:12.738589414 +0000 UTC m=+1433.318590066" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.763622 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b315-account-create-update-nnnmm" podStartSLOduration=4.763594283 podStartE2EDuration="4.763594283s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.745051939 +0000 UTC m=+1433.325052611" watchObservedRunningTime="2026-02-17 14:29:12.763594283 +0000 UTC m=+1433.343594935" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.763985 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"64368c72122ad3f89c0f24723879f29182192bc63b1378caa4cab7c75ec86f22"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.773060 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrcjs" 
event={"ID":"93fb932d-6901-44d9-a508-a32692308154","Type":"ContainerStarted","Data":"bbd66e54a094fa112a253b7ef7051fb419564765ab4f001b118d257c18b4e927"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.780386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be62-account-create-update-sl2zr" event={"ID":"cb3e6eca-01ec-4a72-b83c-80183169dbf1","Type":"ContainerStarted","Data":"6705dec66fd79dde4dbcc153b9f177713ac34f9c71bcb883d6b9433d01f8d9be"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.780452 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be62-account-create-update-sl2zr" event={"ID":"cb3e6eca-01ec-4a72-b83c-80183169dbf1","Type":"ContainerStarted","Data":"cb8dfa15e8b674b1308cf9f18737c57ea2aac72e884995a2f737b5b6ffa62fa0"} Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.789437 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-7wqqm" podStartSLOduration=4.789407864 podStartE2EDuration="4.789407864s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.760483868 +0000 UTC m=+1433.340484520" watchObservedRunningTime="2026-02-17 14:29:12.789407864 +0000 UTC m=+1433.369408516" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.801217 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-a355-account-create-update-wzz5t" podStartSLOduration=4.801187574 podStartE2EDuration="4.801187574s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.778158069 +0000 UTC m=+1433.358158721" watchObservedRunningTime="2026-02-17 14:29:12.801187574 +0000 UTC m=+1433.381188226" Feb 17 14:29:12 crc kubenswrapper[4762]: I0217 14:29:12.826336 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-be62-account-create-update-sl2zr" podStartSLOduration=4.826307417 podStartE2EDuration="4.826307417s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:12.813290393 +0000 UTC m=+1433.393291055" watchObservedRunningTime="2026-02-17 14:29:12.826307417 +0000 UTC m=+1433.406308069" Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.822302 4762 generic.go:334] "Generic (PLEG): container finished" podID="3b691b6d-c42b-491d-a1d0-3c5cb236598b" containerID="f76f0a45f4c784522da9919e5d767233cb61dece1943b8b5e5308eda5839e74e" exitCode=0 Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.822802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7wqqm" event={"ID":"3b691b6d-c42b-491d-a1d0-3c5cb236598b","Type":"ContainerDied","Data":"f76f0a45f4c784522da9919e5d767233cb61dece1943b8b5e5308eda5839e74e"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.832991 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xspft-config-vz647"] Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.836013 4762 generic.go:334] "Generic (PLEG): container finished" podID="93fb932d-6901-44d9-a508-a32692308154" containerID="bbd66e54a094fa112a253b7ef7051fb419564765ab4f001b118d257c18b4e927" exitCode=0 Feb 17 14:29:13 crc 
kubenswrapper[4762]: I0217 14:29:13.836325 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrcjs" event={"ID":"93fb932d-6901-44d9-a508-a32692308154","Type":"ContainerDied","Data":"bbd66e54a094fa112a253b7ef7051fb419564765ab4f001b118d257c18b4e927"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.847908 4762 generic.go:334] "Generic (PLEG): container finished" podID="ee986585-bdb5-4bed-8002-7cf0a80784a8" containerID="9f1ce5996958f9dc7ad6f6950a8991ff22e19800bb34ab246870e6e484d2caab" exitCode=0 Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.848000 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a355-account-create-update-wzz5t" event={"ID":"ee986585-bdb5-4bed-8002-7cf0a80784a8","Type":"ContainerDied","Data":"9f1ce5996958f9dc7ad6f6950a8991ff22e19800bb34ab246870e6e484d2caab"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.856698 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xspft-config-vz647"] Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.865251 4762 generic.go:334] "Generic (PLEG): container finished" podID="e8bc1c0d-6392-40df-a3e9-3800d78b8a46" containerID="630e37dab7f019f6a2702f87903daaf8a2d343b5f5d4e2a8a3d76495731261c0" exitCode=0 Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.865386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvgmb" event={"ID":"e8bc1c0d-6392-40df-a3e9-3800d78b8a46","Type":"ContainerDied","Data":"630e37dab7f019f6a2702f87903daaf8a2d343b5f5d4e2a8a3d76495731261c0"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.896886 4762 generic.go:334] "Generic (PLEG): container finished" podID="43ed625c-d879-4409-9450-d61b3f7cc686" containerID="7b78434d42294952137d4e9b42996fd1d92e1096fa03ab5d7c829ec188c416fa" exitCode=0 Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.896981 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8332-account-create-update-8vvzv" event={"ID":"43ed625c-d879-4409-9450-d61b3f7cc686","Type":"ContainerDied","Data":"7b78434d42294952137d4e9b42996fd1d92e1096fa03ab5d7c829ec188c416fa"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.930800 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerStarted","Data":"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a"} Feb 17 14:29:13 crc kubenswrapper[4762]: I0217 14:29:13.995485 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-66nlq" podStartSLOduration=5.369348793 podStartE2EDuration="20.995466248s" podCreationTimestamp="2026-02-17 14:28:53 +0000 UTC" firstStartedPulling="2026-02-17 14:28:56.904861779 +0000 UTC m=+1417.484862431" lastFinishedPulling="2026-02-17 14:29:12.530979234 +0000 UTC m=+1433.110979886" observedRunningTime="2026-02-17 14:29:13.993288688 +0000 UTC m=+1434.573289350" watchObservedRunningTime="2026-02-17 14:29:13.995466248 +0000 UTC m=+1434.575466900" Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.000983 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"3d6ab9069338178b07d3daade657b799a00d04258d016fa641feba9f3a3f60fa"} Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.023896 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="7220a0cb-7e9b-4648-ae3c-3289c1aa3493" containerID="b89fd92eb8a368b84e6a672c76e39069e38c02895857ae1e77aa283881d886ed" exitCode=0 Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.023983 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tvd94" event={"ID":"7220a0cb-7e9b-4648-ae3c-3289c1aa3493","Type":"ContainerDied","Data":"b89fd92eb8a368b84e6a672c76e39069e38c02895857ae1e77aa283881d886ed"} Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.039308 4762 generic.go:334] "Generic (PLEG): container finished" podID="d8300c70-e571-49c5-a403-d645237d7012" containerID="01cf411bdaa952701750a9df2a25a47608282543566e90ccf00178957239f1ce" exitCode=0 Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.039398 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z944d" event={"ID":"d8300c70-e571-49c5-a403-d645237d7012","Type":"ContainerDied","Data":"01cf411bdaa952701750a9df2a25a47608282543566e90ccf00178957239f1ce"} Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.065054 4762 generic.go:334] "Generic (PLEG): container finished" podID="8ad6e8de-6bb3-4a3e-b664-db44abab1875" containerID="3dcc57905933c53b081cbe5b6724219a68df8eca2edf14101a8004213f41dd23" exitCode=0 Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.065155 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b315-account-create-update-nnnmm" event={"ID":"8ad6e8de-6bb3-4a3e-b664-db44abab1875","Type":"ContainerDied","Data":"3dcc57905933c53b081cbe5b6724219a68df8eca2edf14101a8004213f41dd23"} Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.073082 4762 generic.go:334] "Generic (PLEG): container finished" podID="cb3e6eca-01ec-4a72-b83c-80183169dbf1" containerID="6705dec66fd79dde4dbcc153b9f177713ac34f9c71bcb883d6b9433d01f8d9be" exitCode=0 Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.111564 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395d7b35-d540-4222-8009-d29b24d0f1be" path="/var/lib/kubelet/pods/395d7b35-d540-4222-8009-d29b24d0f1be/volumes" Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.112684 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be62-account-create-update-sl2zr" event={"ID":"cb3e6eca-01ec-4a72-b83c-80183169dbf1","Type":"ContainerDied","Data":"6705dec66fd79dde4dbcc153b9f177713ac34f9c71bcb883d6b9433d01f8d9be"} Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.112855 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.112929 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:14 crc kubenswrapper[4762]: I0217 14:29:14.955329 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lrcjs" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.090928 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrcjs" event={"ID":"93fb932d-6901-44d9-a508-a32692308154","Type":"ContainerDied","Data":"70c203b5b567c8d0992e3550593c7ba4b7e1dfd0bc279bb3973333529fe62a0c"} Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.090963 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lrcjs" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.090976 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c203b5b567c8d0992e3550593c7ba4b7e1dfd0bc279bb3973333529fe62a0c" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.097324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"0a16f3ded347a1d801e464e504ee74febfd33afedde43f8aaa60cafed6c2e1c2"} Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.115016 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts\") pod \"93fb932d-6901-44d9-a508-a32692308154\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.115111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5gbq\" (UniqueName: \"kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq\") pod \"93fb932d-6901-44d9-a508-a32692308154\" (UID: \"93fb932d-6901-44d9-a508-a32692308154\") " Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.117113 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93fb932d-6901-44d9-a508-a32692308154" (UID: "93fb932d-6901-44d9-a508-a32692308154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.285637 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93fb932d-6901-44d9-a508-a32692308154-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.298905 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-66nlq" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="registry-server" probeResult="failure" output=< Feb 17 14:29:15 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:29:15 crc kubenswrapper[4762]: > Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.301199 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq" (OuterVolumeSpecName: "kube-api-access-l5gbq") pod "93fb932d-6901-44d9-a508-a32692308154" (UID: "93fb932d-6901-44d9-a508-a32692308154"). InnerVolumeSpecName "kube-api-access-l5gbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:15 crc kubenswrapper[4762]: I0217 14:29:15.387439 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5gbq\" (UniqueName: \"kubernetes.io/projected/93fb932d-6901-44d9-a508-a32692308154-kube-api-access-l5gbq\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:16 crc kubenswrapper[4762]: I0217 14:29:16.117040 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerStarted","Data":"eda36daa74903f3a9ca7ed20707f4b1c0301c43a2da2ba7419cd56668b24c592"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.166363 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b315-account-create-update-nnnmm" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.166374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7wqqm" event={"ID":"3b691b6d-c42b-491d-a1d0-3c5cb236598b","Type":"ContainerDied","Data":"adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.167301 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf358d39541d880d0924153ccc59a4c5f63585ab12ad8b6177b84d1d5753470" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.178200 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tvd94" event={"ID":"7220a0cb-7e9b-4648-ae3c-3289c1aa3493","Type":"ContainerDied","Data":"5bc44e17f8c7431a0b7d2e7b57fb305c5680f807bd256f7486001524ab363d64"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.178238 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc44e17f8c7431a0b7d2e7b57fb305c5680f807bd256f7486001524ab363d64" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.180318 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z944d" event={"ID":"d8300c70-e571-49c5-a403-d645237d7012","Type":"ContainerDied","Data":"52c9fcb7745dc3768588a2eff3eac5d4d9c26668148f744f033828d29f11e00f"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.180421 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c9fcb7745dc3768588a2eff3eac5d4d9c26668148f744f033828d29f11e00f" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.181967 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b315-account-create-update-nnnmm" event={"ID":"8ad6e8de-6bb3-4a3e-b664-db44abab1875","Type":"ContainerDied","Data":"f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.181996 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78f82f766b5523c933bdb5c0f5aa09adbd5874bd090b22e573e9eb3f4581167" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.181976 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b315-account-create-update-nnnmm" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.183143 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-be62-account-create-update-sl2zr" event={"ID":"cb3e6eca-01ec-4a72-b83c-80183169dbf1","Type":"ContainerDied","Data":"cb8dfa15e8b674b1308cf9f18737c57ea2aac72e884995a2f737b5b6ffa62fa0"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.183174 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb8dfa15e8b674b1308cf9f18737c57ea2aac72e884995a2f737b5b6ffa62fa0" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.184775 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8332-account-create-update-8vvzv" event={"ID":"43ed625c-d879-4409-9450-d61b3f7cc686","Type":"ContainerDied","Data":"62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.184942 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b54ba2b4cd99e389036c3aff56b3977f012dfd2c68c7897ad8870082c1d2dc" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.186587 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a355-account-create-update-wzz5t" event={"ID":"ee986585-bdb5-4bed-8002-7cf0a80784a8","Type":"ContainerDied","Data":"cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.186720 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc8636ca3d1e76630efb7f349669e7256877bcdb250734a9da6cba02cccc616" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.198964 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zvgmb" event={"ID":"e8bc1c0d-6392-40df-a3e9-3800d78b8a46","Type":"ContainerDied","Data":"0f6b07184d4ec7e0e77f730fdacc51fe2d3c82739f93a456f57cbec130722f4f"} Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.199006 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6b07184d4ec7e0e77f730fdacc51fe2d3c82739f93a456f57cbec130722f4f" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.201430 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-z944d" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.233909 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts\") pod \"d8300c70-e571-49c5-a403-d645237d7012\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.234321 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glqdv\" (UniqueName: \"kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv\") pod \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.234490 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc86k\" (UniqueName: \"kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k\") pod \"d8300c70-e571-49c5-a403-d645237d7012\" (UID: \"d8300c70-e571-49c5-a403-d645237d7012\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.234678 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8300c70-e571-49c5-a403-d645237d7012" (UID: "d8300c70-e571-49c5-a403-d645237d7012"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.234919 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts\") pod \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\" (UID: \"8ad6e8de-6bb3-4a3e-b664-db44abab1875\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.236606 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8300c70-e571-49c5-a403-d645237d7012-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.237857 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ad6e8de-6bb3-4a3e-b664-db44abab1875" (UID: "8ad6e8de-6bb3-4a3e-b664-db44abab1875"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.241514 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tvd94" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.243412 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k" (OuterVolumeSpecName: "kube-api-access-mc86k") pod "d8300c70-e571-49c5-a403-d645237d7012" (UID: "d8300c70-e571-49c5-a403-d645237d7012"). InnerVolumeSpecName "kube-api-access-mc86k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.243574 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv" (OuterVolumeSpecName: "kube-api-access-glqdv") pod "8ad6e8de-6bb3-4a3e-b664-db44abab1875" (UID: "8ad6e8de-6bb3-4a3e-b664-db44abab1875"). InnerVolumeSpecName "kube-api-access-glqdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.250190 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a355-account-create-update-wzz5t" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.339738 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts\") pod \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340005 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzrnp\" (UniqueName: \"kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp\") pod \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\" (UID: \"7220a0cb-7e9b-4648-ae3c-3289c1aa3493\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340033 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2jb\" (UniqueName: \"kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb\") pod \"ee986585-bdb5-4bed-8002-7cf0a80784a8\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340132 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts\") pod \"ee986585-bdb5-4bed-8002-7cf0a80784a8\" (UID: \"ee986585-bdb5-4bed-8002-7cf0a80784a8\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340294 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7220a0cb-7e9b-4648-ae3c-3289c1aa3493" (UID: "7220a0cb-7e9b-4648-ae3c-3289c1aa3493"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340684 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340710 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad6e8de-6bb3-4a3e-b664-db44abab1875-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340721 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glqdv\" (UniqueName: \"kubernetes.io/projected/8ad6e8de-6bb3-4a3e-b664-db44abab1875-kube-api-access-glqdv\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.340733 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc86k\" (UniqueName: \"kubernetes.io/projected/d8300c70-e571-49c5-a403-d645237d7012-kube-api-access-mc86k\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.345232 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb" (OuterVolumeSpecName: "kube-api-access-xn2jb") pod "ee986585-bdb5-4bed-8002-7cf0a80784a8" (UID: "ee986585-bdb5-4bed-8002-7cf0a80784a8"). InnerVolumeSpecName "kube-api-access-xn2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.345287 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp" (OuterVolumeSpecName: "kube-api-access-wzrnp") pod "7220a0cb-7e9b-4648-ae3c-3289c1aa3493" (UID: "7220a0cb-7e9b-4648-ae3c-3289c1aa3493"). InnerVolumeSpecName "kube-api-access-wzrnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.348292 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee986585-bdb5-4bed-8002-7cf0a80784a8" (UID: "ee986585-bdb5-4bed-8002-7cf0a80784a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.529918 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzrnp\" (UniqueName: \"kubernetes.io/projected/7220a0cb-7e9b-4648-ae3c-3289c1aa3493-kube-api-access-wzrnp\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.529960 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2jb\" (UniqueName: \"kubernetes.io/projected/ee986585-bdb5-4bed-8002-7cf0a80784a8-kube-api-access-xn2jb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.529974 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee986585-bdb5-4bed-8002-7cf0a80784a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.569438 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zvgmb" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.591235 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8332-account-create-update-8vvzv" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.598313 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be62-account-create-update-sl2zr" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.627262 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7wqqm" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739050 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw6b6\" (UniqueName: \"kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6\") pod \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\" (UID: \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739426 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts\") pod \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739533 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65ssz\" (UniqueName: \"kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz\") pod \"43ed625c-d879-4409-9450-d61b3f7cc686\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739590 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqt5m\" (UniqueName: \"kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m\") pod \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\" (UID: \"e8bc1c0d-6392-40df-a3e9-3800d78b8a46\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739676 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glbdz\" (UniqueName: \"kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz\") pod \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739710 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts\") pod \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\" (UID: \"cb3e6eca-01ec-4a72-b83c-80183169dbf1\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739752 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts\") pod \"43ed625c-d879-4409-9450-d61b3f7cc686\" (UID: \"43ed625c-d879-4409-9450-d61b3f7cc686\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.739785 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts\") pod \"3b691b6d-c42b-491d-a1d0-3c5cb236598b\" (UID: 
\"3b691b6d-c42b-491d-a1d0-3c5cb236598b\") " Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.740544 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8bc1c0d-6392-40df-a3e9-3800d78b8a46" (UID: "e8bc1c0d-6392-40df-a3e9-3800d78b8a46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.741055 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb3e6eca-01ec-4a72-b83c-80183169dbf1" (UID: "cb3e6eca-01ec-4a72-b83c-80183169dbf1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.741422 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43ed625c-d879-4409-9450-d61b3f7cc686" (UID: "43ed625c-d879-4409-9450-d61b3f7cc686"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.741452 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.741475 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3e6eca-01ec-4a72-b83c-80183169dbf1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.741830 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b691b6d-c42b-491d-a1d0-3c5cb236598b" (UID: "3b691b6d-c42b-491d-a1d0-3c5cb236598b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.745049 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6" (OuterVolumeSpecName: "kube-api-access-rw6b6") pod "3b691b6d-c42b-491d-a1d0-3c5cb236598b" (UID: "3b691b6d-c42b-491d-a1d0-3c5cb236598b"). InnerVolumeSpecName "kube-api-access-rw6b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.745161 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz" (OuterVolumeSpecName: "kube-api-access-glbdz") pod "cb3e6eca-01ec-4a72-b83c-80183169dbf1" (UID: "cb3e6eca-01ec-4a72-b83c-80183169dbf1"). InnerVolumeSpecName "kube-api-access-glbdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.745197 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m" (OuterVolumeSpecName: "kube-api-access-mqt5m") pod "e8bc1c0d-6392-40df-a3e9-3800d78b8a46" (UID: "e8bc1c0d-6392-40df-a3e9-3800d78b8a46"). InnerVolumeSpecName "kube-api-access-mqt5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.746709 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz" (OuterVolumeSpecName: "kube-api-access-65ssz") pod "43ed625c-d879-4409-9450-d61b3f7cc686" (UID: "43ed625c-d879-4409-9450-d61b3f7cc686"). InnerVolumeSpecName "kube-api-access-65ssz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844754 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65ssz\" (UniqueName: \"kubernetes.io/projected/43ed625c-d879-4409-9450-d61b3f7cc686-kube-api-access-65ssz\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844792 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqt5m\" (UniqueName: \"kubernetes.io/projected/e8bc1c0d-6392-40df-a3e9-3800d78b8a46-kube-api-access-mqt5m\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844806 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glbdz\" (UniqueName: \"kubernetes.io/projected/cb3e6eca-01ec-4a72-b83c-80183169dbf1-kube-api-access-glbdz\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844822 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43ed625c-d879-4409-9450-d61b3f7cc686-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844834 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b691b6d-c42b-491d-a1d0-3c5cb236598b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:20 crc kubenswrapper[4762]: I0217 14:29:20.844846 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw6b6\" (UniqueName: \"kubernetes.io/projected/3b691b6d-c42b-491d-a1d0-3c5cb236598b-kube-api-access-rw6b6\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.219497 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"6efcbcd2274eccb96639edc41a73e6e3d2d86ca8dd52a485a28e709684d5a280"} Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.219549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"466a7dc3-63d2-4995-ab6f-712df183303d","Type":"ContainerStarted","Data":"bc17450e2d2ce6c9cd72a59982370d185c9cc1f39491abcf133678c4c0a87f6e"} Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222130 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7wqqm" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222161 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tvd94" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222204 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q6l4w" event={"ID":"8acf7e9f-6215-417b-b385-68b30decf4c8","Type":"ContainerStarted","Data":"34cc702e78165783238ac76fa93e6b1533c509faaf06d4e865695cada48f2d68"} Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222210 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8332-account-create-update-8vvzv" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222258 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-be62-account-create-update-sl2zr" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222274 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z944d" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222307 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a355-account-create-update-wzz5t" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.222355 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zvgmb" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.280199 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=48.479281417 podStartE2EDuration="1m10.280163776s" podCreationTimestamp="2026-02-17 14:28:11 +0000 UTC" firstStartedPulling="2026-02-17 14:28:45.590968153 +0000 UTC m=+1406.170968805" lastFinishedPulling="2026-02-17 14:29:07.391850512 +0000 UTC m=+1427.971851164" observedRunningTime="2026-02-17 14:29:21.264069149 +0000 UTC m=+1441.844069811" watchObservedRunningTime="2026-02-17 14:29:21.280163776 +0000 UTC m=+1441.860164418" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.316181 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-q6l4w" podStartSLOduration=4.215549574 podStartE2EDuration="13.316162254s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="2026-02-17 14:29:10.848846306 +0000 UTC m=+1431.428846958" lastFinishedPulling="2026-02-17 14:29:19.949458986 +0000 UTC m=+1440.529459638" observedRunningTime="2026-02-17 14:29:21.301261259 +0000 UTC m=+1441.881261901" watchObservedRunningTime="2026-02-17 14:29:21.316162254 +0000 UTC m=+1441.896162906" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.811332 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.811899 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3e6eca-01ec-4a72-b83c-80183169dbf1" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.811927 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3e6eca-01ec-4a72-b83c-80183169dbf1" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.811956 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8300c70-e571-49c5-a403-d645237d7012" 
containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.811967 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8300c70-e571-49c5-a403-d645237d7012" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.811991 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220a0cb-7e9b-4648-ae3c-3289c1aa3493" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.812000 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220a0cb-7e9b-4648-ae3c-3289c1aa3493" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814801 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed625c-d879-4409-9450-d61b3f7cc686" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814820 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed625c-d879-4409-9450-d61b3f7cc686" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814842 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee986585-bdb5-4bed-8002-7cf0a80784a8" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814850 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee986585-bdb5-4bed-8002-7cf0a80784a8" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814873 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad6e8de-6bb3-4a3e-b664-db44abab1875" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814882 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad6e8de-6bb3-4a3e-b664-db44abab1875" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814922 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fb932d-6901-44d9-a508-a32692308154" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814931 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fb932d-6901-44d9-a508-a32692308154" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814949 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b691b6d-c42b-491d-a1d0-3c5cb236598b" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814958 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b691b6d-c42b-491d-a1d0-3c5cb236598b" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.814982 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bc1c0d-6392-40df-a3e9-3800d78b8a46" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.814992 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bc1c0d-6392-40df-a3e9-3800d78b8a46" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: E0217 14:29:21.815013 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395d7b35-d540-4222-8009-d29b24d0f1be" containerName="ovn-config" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815022 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d7b35-d540-4222-8009-d29b24d0f1be" 
containerName="ovn-config" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815402 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220a0cb-7e9b-4648-ae3c-3289c1aa3493" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815421 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fb932d-6901-44d9-a508-a32692308154" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815430 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3e6eca-01ec-4a72-b83c-80183169dbf1" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815447 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bc1c0d-6392-40df-a3e9-3800d78b8a46" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815457 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad6e8de-6bb3-4a3e-b664-db44abab1875" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815465 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="395d7b35-d540-4222-8009-d29b24d0f1be" containerName="ovn-config" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815475 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee986585-bdb5-4bed-8002-7cf0a80784a8" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815489 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ed625c-d879-4409-9450-d61b3f7cc686" containerName="mariadb-account-create-update" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815497 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8300c70-e571-49c5-a403-d645237d7012" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.815507 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b691b6d-c42b-491d-a1d0-3c5cb236598b" containerName="mariadb-database-create" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.816700 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.819469 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 17 14:29:21 crc kubenswrapper[4762]: I0217 14:29:21.830991 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018115 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlkxk\" (UniqueName: \"kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018205 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018702 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.018801 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221201 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: 
\"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221246 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221281 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlkxk\" (UniqueName: \"kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221307 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.221379 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.233571 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.237268 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.237469 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.237922 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.239728 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 
14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.251597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlkxk\" (UniqueName: \"kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk\") pod \"dnsmasq-dns-5c79d794d7-jzb4k\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.437537 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:22 crc kubenswrapper[4762]: I0217 14:29:22.803455 4762 scope.go:117] "RemoveContainer" containerID="c8fb48ad1878b5889f3ee2586929930c5c785db1918e85937bc99df92ef018b4" Feb 17 14:29:23 crc kubenswrapper[4762]: W0217 14:29:23.337432 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa77bfe8_fbc4_42c5_923a_2909909db58d.slice/crio-fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46 WatchSource:0}: Error finding container fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46: Status 404 returned error can't find the container with id fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46 Feb 17 14:29:23 crc kubenswrapper[4762]: I0217 14:29:23.360773 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.164413 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.236126 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.313143 4762 generic.go:334] "Generic (PLEG): container finished" podID="bad07381-6a78-4418-b451-0521ee7d95f9" containerID="eda36daa74903f3a9ca7ed20707f4b1c0301c43a2da2ba7419cd56668b24c592" exitCode=0 Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.313449 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerDied","Data":"eda36daa74903f3a9ca7ed20707f4b1c0301c43a2da2ba7419cd56668b24c592"} Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.315841 4762 generic.go:334] "Generic (PLEG): container finished" podID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerID="b8efb2c46c08b1153856a7affefe3521f37a0170301d64f770b195f1c329f359" exitCode=0 Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.315912 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" event={"ID":"aa77bfe8-fbc4-42c5-923a-2909909db58d","Type":"ContainerDied","Data":"b8efb2c46c08b1153856a7affefe3521f37a0170301d64f770b195f1c329f359"} Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.315932 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" event={"ID":"aa77bfe8-fbc4-42c5-923a-2909909db58d","Type":"ContainerStarted","Data":"fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46"} Feb 17 14:29:24 crc kubenswrapper[4762]: I0217 14:29:24.419958 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.327610 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerStarted","Data":"f1a27be0b729d6cd14d5b4a9722009cf3f0220b0621a738bc0846398f46f133a"} Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.331958 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" event={"ID":"aa77bfe8-fbc4-42c5-923a-2909909db58d","Type":"ContainerStarted","Data":"31ca1341142a5a93a903a4b632666e572dc9639b7ed02f26803e5113e0b8521d"} Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.334703 4762 generic.go:334] "Generic (PLEG): container finished" podID="ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" containerID="6891113cf2d6697324e6a167a135f0c060a38fb3d450da77bda9de60f207c8f2" exitCode=0 Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.334969 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-66nlq" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="registry-server" containerID="cri-o://61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a" gracePeriod=2 Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.335087 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tt6cp" event={"ID":"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574","Type":"ContainerDied","Data":"6891113cf2d6697324e6a167a135f0c060a38fb3d450da77bda9de60f207c8f2"} Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.356184 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" podStartSLOduration=4.3561645460000005 podStartE2EDuration="4.356164546s" podCreationTimestamp="2026-02-17 14:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:25.34857246 +0000 UTC m=+1445.928573112" watchObservedRunningTime="2026-02-17 14:29:25.356164546 +0000 UTC m=+1445.936165208" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.833201 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.836300 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content\") pod \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.836615 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsgzw\" (UniqueName: \"kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw\") pod \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.836776 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities\") pod \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\" (UID: \"b1b8d793-bf38-4c87-8830-21b7dc5ad129\") " Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.837710 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities" (OuterVolumeSpecName: "utilities") pod "b1b8d793-bf38-4c87-8830-21b7dc5ad129" (UID: "b1b8d793-bf38-4c87-8830-21b7dc5ad129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.842928 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw" (OuterVolumeSpecName: "kube-api-access-wsgzw") pod "b1b8d793-bf38-4c87-8830-21b7dc5ad129" (UID: "b1b8d793-bf38-4c87-8830-21b7dc5ad129"). InnerVolumeSpecName "kube-api-access-wsgzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.939107 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.939138 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsgzw\" (UniqueName: \"kubernetes.io/projected/b1b8d793-bf38-4c87-8830-21b7dc5ad129-kube-api-access-wsgzw\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:25 crc kubenswrapper[4762]: I0217 14:29:25.975101 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1b8d793-bf38-4c87-8830-21b7dc5ad129" (UID: "b1b8d793-bf38-4c87-8830-21b7dc5ad129"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.042459 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b8d793-bf38-4c87-8830-21b7dc5ad129-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.358619 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zvgmb"] Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.361617 4762 generic.go:334] "Generic (PLEG): container finished" podID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerID="61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a" exitCode=0 Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.361704 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerDied","Data":"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a"} Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.361733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66nlq" event={"ID":"b1b8d793-bf38-4c87-8830-21b7dc5ad129","Type":"ContainerDied","Data":"a891055456b5d44d8d88ca49c1b18f0a38ab368180609450008092bdb9761cc1"} Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.361750 4762 scope.go:117] "RemoveContainer" containerID="61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.361889 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66nlq" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.370568 4762 generic.go:334] "Generic (PLEG): container finished" podID="8acf7e9f-6215-417b-b385-68b30decf4c8" containerID="34cc702e78165783238ac76fa93e6b1533c509faaf06d4e865695cada48f2d68" exitCode=0 Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.370850 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q6l4w" event={"ID":"8acf7e9f-6215-417b-b385-68b30decf4c8","Type":"ContainerDied","Data":"34cc702e78165783238ac76fa93e6b1533c509faaf06d4e865695cada48f2d68"} Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.371262 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.380871 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zvgmb"] Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.413016 4762 scope.go:117] "RemoveContainer" containerID="3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.442722 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.496916 4762 scope.go:117] "RemoveContainer" containerID="35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.497257 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-66nlq"] Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.573120 4762 scope.go:117] "RemoveContainer" containerID="61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a" Feb 17 
14:29:26 crc kubenswrapper[4762]: E0217 14:29:26.577285 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a\": container with ID starting with 61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a not found: ID does not exist" containerID="61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.577338 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a"} err="failed to get container status \"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a\": rpc error: code = NotFound desc = could not find container \"61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a\": container with ID starting with 61f637cc48c650ac38248b6fd682a35339e88833d83e41df39db3f5c8b9ce55a not found: ID does not exist" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.577367 4762 scope.go:117] "RemoveContainer" containerID="3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba" Feb 17 14:29:26 crc kubenswrapper[4762]: E0217 14:29:26.577727 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba\": container with ID starting with 3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba not found: ID does not exist" containerID="3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.577749 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba"} err="failed to get container status \"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba\": rpc error: code = NotFound desc = could not find container \"3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba\": container with ID starting with 3af5f1c2e3eae4af92513c633126b82507a2e15ad98d8cbca87de620b0da42ba not found: ID does not exist" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.577760 4762 scope.go:117] "RemoveContainer" containerID="35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d" Feb 17 14:29:26 crc kubenswrapper[4762]: E0217 14:29:26.577928 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d\": container with ID starting with 35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d not found: ID does not exist" containerID="35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d" Feb 17 14:29:26 crc kubenswrapper[4762]: I0217 14:29:26.577945 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d"} err="failed to get container status \"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d\": rpc error: code = NotFound desc = could not find container \"35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d\": container with ID starting with 35c39c92c8eda0fbca3545c4988d24c3f444d23c969dbd0b5a7a8b220c7be24d not found: ID does not exist" Feb 17 14:29:27 crc 
kubenswrapper[4762]: I0217 14:29:27.386238 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tt6cp" event={"ID":"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574","Type":"ContainerDied","Data":"a8a8552e4bd0a4280ec3178c0314e6f76809e9713d6dffb2e53f1e6a110904e2"} Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.387031 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a8552e4bd0a4280ec3178c0314e6f76809e9713d6dffb2e53f1e6a110904e2" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.389430 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tt6cp" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.493552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data\") pod \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.493808 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvpj4\" (UniqueName: \"kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4\") pod \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.493834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle\") pod \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.493863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data\") pod \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\" (UID: \"ddad90d3-b6d4-4a8c-82cd-883fcc0e0574\") " Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.507395 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4" (OuterVolumeSpecName: "kube-api-access-dvpj4") pod "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" (UID: "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574"). InnerVolumeSpecName "kube-api-access-dvpj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.513714 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" (UID: "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.596329 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvpj4\" (UniqueName: \"kubernetes.io/projected/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-kube-api-access-dvpj4\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.596365 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.805322 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" (UID: "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:27 crc kubenswrapper[4762]: I0217 14:29:27.902462 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.009328 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data" (OuterVolumeSpecName: "config-data") pod "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" (UID: "ddad90d3-b6d4-4a8c-82cd-883fcc0e0574"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.085868 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" path="/var/lib/kubelet/pods/b1b8d793-bf38-4c87-8830-21b7dc5ad129/volumes" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.086977 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bc1c0d-6392-40df-a3e9-3800d78b8a46" path="/var/lib/kubelet/pods/e8bc1c0d-6392-40df-a3e9-3800d78b8a46/volumes" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.089897 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q6l4w" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.110538 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.323129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5fcm\" (UniqueName: \"kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm\") pod \"8acf7e9f-6215-417b-b385-68b30decf4c8\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.323298 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle\") pod \"8acf7e9f-6215-417b-b385-68b30decf4c8\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.323476 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data\") pod \"8acf7e9f-6215-417b-b385-68b30decf4c8\" (UID: \"8acf7e9f-6215-417b-b385-68b30decf4c8\") " Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.344390 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm" (OuterVolumeSpecName: "kube-api-access-n5fcm") pod "8acf7e9f-6215-417b-b385-68b30decf4c8" (UID: "8acf7e9f-6215-417b-b385-68b30decf4c8"). InnerVolumeSpecName "kube-api-access-n5fcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.387395 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8acf7e9f-6215-417b-b385-68b30decf4c8" (UID: "8acf7e9f-6215-417b-b385-68b30decf4c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.402951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q6l4w" event={"ID":"8acf7e9f-6215-417b-b385-68b30decf4c8","Type":"ContainerDied","Data":"0b35975ac9f3690990cbb5eb02794889182693f6b475e0c1eb88db555ec1f1f7"} Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.403004 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b35975ac9f3690990cbb5eb02794889182693f6b475e0c1eb88db555ec1f1f7" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.403074 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q6l4w" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.409886 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tt6cp" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.416166 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerStarted","Data":"063a87d141e12e715a7c1051035267039d8dfae9bf3ed53c9481c4bbf6939c17"} Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.424446 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data" (OuterVolumeSpecName: "config-data") pod "8acf7e9f-6215-417b-b385-68b30decf4c8" (UID: "8acf7e9f-6215-417b-b385-68b30decf4c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.425335 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.425366 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5fcm\" (UniqueName: \"kubernetes.io/projected/8acf7e9f-6215-417b-b385-68b30decf4c8-kube-api-access-n5fcm\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.425383 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8acf7e9f-6215-417b-b385-68b30decf4c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.879586 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.880153 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="dnsmasq-dns" containerID="cri-o://31ca1341142a5a93a903a4b632666e572dc9639b7ed02f26803e5113e0b8521d" gracePeriod=10 Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.922714 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-llc75"] Feb 17 14:29:28 crc kubenswrapper[4762]: E0217 14:29:28.923742 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="registry-server" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.923759 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="registry-server" Feb 17 14:29:28 crc kubenswrapper[4762]: E0217 14:29:28.923783 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="extract-content" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.923789 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="extract-content" Feb 17 14:29:28 crc kubenswrapper[4762]: E0217 14:29:28.923801 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8acf7e9f-6215-417b-b385-68b30decf4c8" containerName="keystone-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.923808 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acf7e9f-6215-417b-b385-68b30decf4c8" containerName="keystone-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: E0217 14:29:28.923819 
4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="extract-utilities" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.923825 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="extract-utilities" Feb 17 14:29:28 crc kubenswrapper[4762]: E0217 14:29:28.923850 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" containerName="glance-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.923856 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" containerName="glance-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.934435 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b8d793-bf38-4c87-8830-21b7dc5ad129" containerName="registry-server" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.934468 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8acf7e9f-6215-417b-b385-68b30decf4c8" containerName="keystone-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.934491 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" containerName="glance-db-sync" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.935389 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.954767 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.955439 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.955622 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.955800 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jgkd7" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.956416 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 14:29:28 crc kubenswrapper[4762]: I0217 14:29:28.994319 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-llc75"] Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.037753 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.040188 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.069385 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.069856 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.069917 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.069969 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.069995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070030 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070066 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070143 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzgs9\" (UniqueName: \"kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070178 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfzn\" (UniqueName: \"kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070250 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.070272 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.097910 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173091 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzgs9\" (UniqueName: \"kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173153 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173209 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfzn\" (UniqueName: \"kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173239 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173260 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173319 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173348 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173425 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173451 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.173512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.175150 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.175962 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.178022 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config\") pod 
\"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.178186 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.178243 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-h7qp8"] Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.178787 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.179739 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.181488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.189079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.191822 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.206448 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.215316 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.225336 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzgs9\" (UniqueName: \"kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9\") pod \"dnsmasq-dns-5b868669f-zxtc5\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.225561 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mhg26" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.225835 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h7qp8"] Feb 17 
14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.228804 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.239176 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfzn\" (UniqueName: \"kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn\") pod \"keystone-bootstrap-llc75\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.278792 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4t2\" (UniqueName: \"kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.279007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.279052 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.588901 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.590851 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.690849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4t2\" (UniqueName: \"kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.691117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.691177 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.711465 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bad07381-6a78-4418-b451-0521ee7d95f9","Type":"ContainerStarted","Data":"0c7c9c9451691af5fcd7cf92a06be1c077bbebc86ca78b17e63036a6ebc1ae69"} Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.943847 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.944753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.968900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4t2\" (UniqueName: \"kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2\") pod \"heat-db-sync-h7qp8\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:29 crc kubenswrapper[4762]: I0217 14:29:29.974187 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.037833 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wtc2k"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.039582 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.044486 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wdfj6" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.044925 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.045301 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.148147 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.158112 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-95lkq"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.159397 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.161079 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.170007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.170129 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shf9n\" (UniqueName: \"kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.170245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.187046 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-95lkq"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.188507 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hcfzc" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.188562 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.197521 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.249661 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-h7qp8" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.273900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.273965 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.273991 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274039 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274060 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274096 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgw7j\" (UniqueName: \"kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274134 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274150 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmjt\" (UniqueName: \"kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274166 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274226 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274243 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shf9n\" (UniqueName: \"kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274281 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274297 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274344 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.274374 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.288588 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.302047 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.307784 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.327932 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wtc2k"] Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.336163 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-shf9n\" (UniqueName: \"kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n\") pod \"neutron-db-sync-wtc2k\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") " pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.343160 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.343132724 podStartE2EDuration="22.343132724s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:29.996882527 +0000 UTC m=+1450.576883189" watchObservedRunningTime="2026-02-17 14:29:30.343132724 +0000 UTC m=+1450.923133376" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.622469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:30 crc kubenswrapper[4762]: I0217 14:29:30.622545 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.622631 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.622678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.624429 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.626702 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.626828 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627290 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgw7j\" (UniqueName: \"kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627495 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627547 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmjt\" (UniqueName: \"kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627570 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.627849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.630099 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.630307 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.630342 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.630392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.630675 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.631421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.636782 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.637975 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.639661 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.640316 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.646106 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.656893 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-sync-lq7n6"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.660141 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.679838 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-smktq"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.681376 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.701839 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smktq"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.707709 4762 patch_prober.go:28] interesting pod/logging-loki-gateway-78d96f4c68-9bhm5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": context deadline exceeded" start-of-body= Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.707782 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-78d96f4c68-9bhm5" podUID="a4bee09c-f081-4ca0-aef8-40effbd263dd" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": context deadline exceeded" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.713009 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.713109 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.713114 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sf2vs" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.715777 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-clgpv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.716037 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.716159 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lq7n6"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.729421 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.735189 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.735212 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.735247 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738166 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738296 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6n47\" (UniqueName: \"kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738369 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7snz\" (UniqueName: \"kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738608 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.738716 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.761309 4762 generic.go:334] "Generic (PLEG): container finished" podID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerID="31ca1341142a5a93a903a4b632666e572dc9639b7ed02f26803e5113e0b8521d" exitCode=0 Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.762927 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" event={"ID":"aa77bfe8-fbc4-42c5-923a-2909909db58d","Type":"ContainerDied","Data":"31ca1341142a5a93a903a4b632666e572dc9639b7ed02f26803e5113e0b8521d"} Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.764473 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.771549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmjt\" (UniqueName: \"kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt\") pod \"cinder-db-sync-95lkq\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.789465 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.796170 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.797075 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgw7j\" (UniqueName: \"kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j\") pod \"dnsmasq-dns-bbf5cc879-6jnwv\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.805981 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.806243 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.810170 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.813254 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ckfnj" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.832618 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842344 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842374 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842447 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6n47\" (UniqueName: \"kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " 
pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842477 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842507 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7snz\" (UniqueName: \"kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842631 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6x2\" (UniqueName: \"kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842746 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842780 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842859 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842888 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " 
pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.842923 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.843542 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.843784 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.858180 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.858519 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.882223 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:30.898684 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.156489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.156969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6x2\" (UniqueName: \"kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157080 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" 
Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157129 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157164 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.158007 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.158310 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.157191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.158554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565k4\" (UniqueName: \"kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.158589 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.158671 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159058 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159102 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159375 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.159770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.160313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.160995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.200667 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.229969 4762 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" start-of-body= Feb 17 14:29:31 crc 
kubenswrapper[4762]: I0217 14:29:31.231928 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f7a72999-d771-4b3e-ba91-38078274aa35" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.232044 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bd9n7" podUID="eb14da33-81db-4b59-8325-af90620744fe" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.94:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.239791 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6n47\" (UniqueName: \"kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47\") pod \"barbican-db-sync-smktq\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.243484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6x2\" (UniqueName: \"kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2\") pod \"dnsmasq-dns-56df8fb6b7-2pthv\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.245130 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7snz\" (UniqueName: \"kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz\") pod \"placement-db-sync-lq7n6\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.261712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565k4\" (UniqueName: \"kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.261771 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.261848 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.261883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: 
I0217 14:29:31.261938 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.262068 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.262094 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.262873 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.265075 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.270304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.276417 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.289938 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.320278 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565k4\" (UniqueName: \"kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369215 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369303 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369515 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369840 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.369957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.808153 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.820527 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.821117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.821493 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.821662 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.821826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.821961 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.895817 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.810713 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.896205 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd98bc01ad401fb0843a9dd71ca408e41c0fbbffed1920afb8717f05abdffdd4/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.897344 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.923808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.946204 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.948462 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-95lkq" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.950578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.953802 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:31 crc kubenswrapper[4762]: I0217 14:29:31.958249 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " pod="openstack/ceilometer-0" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.400290 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.417086 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lq7n6" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.478191 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lq7w9"] Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.520264 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lq7w9"] Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.520422 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.537117 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.621957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtlvt\" (UniqueName: \"kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.622112 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.978227 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlvt\" (UniqueName: \"kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.978319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.978394 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.991946 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:32 crc kubenswrapper[4762]: I0217 14:29:32.994305 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.056215 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-llc75"] Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.085165 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.090836 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.091507 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtlvt\" (UniqueName: \"kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt\") pod \"root-account-create-update-lq7w9\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.094743 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.126970 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.417912 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418461 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418517 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fth\" (UniqueName: \"kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418658 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418691 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.418762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.446755 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llc75" event={"ID":"30a7292d-960b-40f9-8b50-e6150d20d2b1","Type":"ContainerStarted","Data":"fde7764079b617e83424cfb0e79b752fd42252aeb6507a14b6e386d7331c4302"} Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.447770 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" event={"ID":"89bb3fe3-d9c4-4292-8a16-79abd3522621","Type":"ContainerStarted","Data":"d5fdfb9d0075f1ae3a13a13ff5c2e9c0eff3ee082c55ade229975a9adcfab927"} Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.457848 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" event={"ID":"aa77bfe8-fbc4-42c5-923a-2909909db58d","Type":"ContainerDied","Data":"fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46"} Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.457893 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc0a5e5909fdaccf8993028933f4575a3294d9c60458d2ec79a5bd712e094d46" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.520763 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.520832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.520869 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fth\" (UniqueName: \"kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.520935 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.520967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.521016 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.521090 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.521934 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.522258 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.529991 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h7qp8"] Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.530538 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.530590 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c94ac0752a1dcb91ec40ba4c560720e8a8734d2d1a06b78b6730ccf35fc18fc/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.537686 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wtc2k"] Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.619222 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.620347 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.623270 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fth\" (UniqueName: \"kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.624048 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:33 crc kubenswrapper[4762]: I0217 14:29:33.663371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.252125 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.392779 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smktq" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.412687 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.422112 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.670710 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.934360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wtc2k" event={"ID":"cc27563b-a5bb-4e82-a286-e0628e7c07b3","Type":"ContainerStarted","Data":"54d95b91a106a65b0420660b469bd04a1f3c060ba563e73230620e8f7980b08c"} Feb 17 14:29:34 crc kubenswrapper[4762]: I0217 14:29:34.939060 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h7qp8" event={"ID":"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3","Type":"ContainerStarted","Data":"f1cb6d2599641f1ecb30bbc8c92a196820b493f5dbba104ea486b3f88b03dc72"} Feb 17 14:29:35 crc kubenswrapper[4762]: I0217 14:29:35.397510 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lq7n6"] Feb 17 14:29:35 crc kubenswrapper[4762]: I0217 14:29:35.485091 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:35 crc kubenswrapper[4762]: I0217 14:29:35.772148 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-95lkq"] Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.013448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" event={"ID":"0b031b2f-52a6-403f-a100-198a4edacc4b","Type":"ContainerStarted","Data":"a91e2eb5154fd9bc4c6262bd2b05158b13a207dce41c057dd125311e8aeec86f"} Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.016827 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lq7n6" event={"ID":"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64","Type":"ContainerStarted","Data":"fdb14fa858fb20e0a11d66cce487ff3929657dd1d7d60d1ae2f3b3e5601969c5"} Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.045920 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-95lkq" event={"ID":"d6ea0210-709e-4a47-87d1-48c811c0ab85","Type":"ContainerStarted","Data":"13d60409a852050d074383c44514d04956a2cf3fe81d23caad70f81fadf9f8f3"} Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.063128 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wtc2k" event={"ID":"cc27563b-a5bb-4e82-a286-e0628e7c07b3","Type":"ContainerStarted","Data":"cd1e6e1172c720beeffc6bfbd56af158da86b64d766a642b82e86e719c4d0803"} Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.126370 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wtc2k" podStartSLOduration=7.126341272 podStartE2EDuration="7.126341272s" podCreationTimestamp="2026-02-17 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:36.098998539 +0000 UTC m=+1456.678999211" watchObservedRunningTime="2026-02-17 14:29:36.126341272 +0000 UTC m=+1456.706341924" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.221986 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.254063 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.255683 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.400537 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlkxk\" (UniqueName: \"kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.400911 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.400959 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.401004 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.401097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.401149 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0\") pod \"aa77bfe8-fbc4-42c5-923a-2909909db58d\" (UID: \"aa77bfe8-fbc4-42c5-923a-2909909db58d\") " Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.419995 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk" (OuterVolumeSpecName: "kube-api-access-zlkxk") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "kube-api-access-zlkxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.486800 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.507955 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlkxk\" (UniqueName: \"kubernetes.io/projected/aa77bfe8-fbc4-42c5-923a-2909909db58d-kube-api-access-zlkxk\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.510387 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.534851 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smktq"] Feb 17 14:29:36 crc kubenswrapper[4762]: I0217 14:29:36.770374 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.235721 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.349398 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.407895 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerStarted","Data":"d2dce3d6df3d3d924acc24709f937ab62f744b764e99c4ad4f86c384d3d0b733"} Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.422232 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.458363 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smktq" event={"ID":"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1","Type":"ContainerStarted","Data":"969796ab12ea8175a5a692ef56eb31d465b47c897c75995370e429effdbfad68"} Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.458477 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.178:5353: i/o timeout" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.473672 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.477253 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" event={"ID":"6fe335f8-8a53-40c6-99ca-d106d01d65f5","Type":"ContainerStarted","Data":"daa12be9136315a7ea901928c1b8cf881f724e11a2a357553880e4f4b82d665b"} Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.488798 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config" (OuterVolumeSpecName: "config") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.567029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llc75" event={"ID":"30a7292d-960b-40f9-8b50-e6150d20d2b1","Type":"ContainerStarted","Data":"3ca505da16de76261387772b87b6a5926a9c46cd51520a42e4b6302224132fcf"} Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.601891 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.616478 4762 generic.go:334] "Generic (PLEG): container finished" podID="89bb3fe3-d9c4-4292-8a16-79abd3522621" containerID="704b7564d2e7faaca77f1ee0311a82f8d81913d4f276d9f6d7e56b39fb337450" exitCode=0 Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.616607 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.618561 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" event={"ID":"89bb3fe3-d9c4-4292-8a16-79abd3522621","Type":"ContainerDied","Data":"704b7564d2e7faaca77f1ee0311a82f8d81913d4f276d9f6d7e56b39fb337450"} Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.633158 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.655909 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.721898 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lq7w9"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.755669 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-llc75" podStartSLOduration=9.755612503 podStartE2EDuration="9.755612503s" podCreationTimestamp="2026-02-17 14:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:37.628244513 +0000 UTC m=+1458.208245185" watchObservedRunningTime="2026-02-17 14:29:37.755612503 +0000 UTC m=+1458.335613155" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.811208 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.824388 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa77bfe8-fbc4-42c5-923a-2909909db58d" (UID: "aa77bfe8-fbc4-42c5-923a-2909909db58d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.825934 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.817869 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.834291 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:37 crc kubenswrapper[4762]: I0217 14:29:37.929659 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa77bfe8-fbc4-42c5-923a-2909909db58d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.376530 4762 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod80db8f3d-cc50-4a3e-8cad-52f614221b4d"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod80db8f3d-cc50-4a3e-8cad-52f614221b4d] : Timed out while waiting for systemd to remove kubepods-burstable-pod80db8f3d_cc50_4a3e_8cad_52f614221b4d.slice" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.574209 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.684966 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" event={"ID":"89bb3fe3-d9c4-4292-8a16-79abd3522621","Type":"ContainerDied","Data":"d5fdfb9d0075f1ae3a13a13ff5c2e9c0eff3ee082c55ade229975a9adcfab927"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.685049 4762 scope.go:117] "RemoveContainer" containerID="704b7564d2e7faaca77f1ee0311a82f8d81913d4f276d9f6d7e56b39fb337450" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.685224 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-zxtc5" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.689232 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerStarted","Data":"6c5d7da2cfc0d3e7a645abc906a398a25a2559c66e10702d6d8b6dba13e4ea20"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.695147 4762 generic.go:334] "Generic (PLEG): container finished" podID="0b031b2f-52a6-403f-a100-198a4edacc4b" containerID="9364a049d67000d238a3e572f01d668fb0dcfc5140e92a736522b9afd7064ef2" exitCode=0 Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.695233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" event={"ID":"0b031b2f-52a6-403f-a100-198a4edacc4b","Type":"ContainerDied","Data":"9364a049d67000d238a3e572f01d668fb0dcfc5140e92a736522b9afd7064ef2"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.708458 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerStarted","Data":"1999f4c8fe6c64ffd51e57cc0d18d220b7831edbdb8e620308824d1cd363aa08"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.713691 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lq7w9" event={"ID":"16658e34-885b-4693-9784-bd985a6acd52","Type":"ContainerStarted","Data":"03bedf90d9de4202da4df646416d5c25cf7f7c0b4f1a31a1cfc7b603b022827f"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.713740 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lq7w9" event={"ID":"16658e34-885b-4693-9784-bd985a6acd52","Type":"ContainerStarted","Data":"36ed2eee59397a6a7057740bc406fd21381b738d570cea66e48c863463b666a6"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.740982 4762 generic.go:334] "Generic (PLEG): container finished" podID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerID="2044375e66eb74aa89a42d758449bbdffc23deab5ea26f284fe1a52af5696bb4" exitCode=0 Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.741347 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" event={"ID":"6fe335f8-8a53-40c6-99ca-d106d01d65f5","Type":"ContainerDied","Data":"2044375e66eb74aa89a42d758449bbdffc23deab5ea26f284fe1a52af5696bb4"} Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.749911 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-lq7w9" podStartSLOduration=7.749891324 podStartE2EDuration="7.749891324s" podCreationTimestamp="2026-02-17 14:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:38.740880999 +0000 UTC m=+1459.320881651" watchObservedRunningTime="2026-02-17 14:29:38.749891324 +0000 UTC m=+1459.329891976" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767399 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767463 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767582 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767660 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767708 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.767734 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzgs9\" (UniqueName: \"kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.794967 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9" (OuterVolumeSpecName: "kube-api-access-gzgs9") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "kube-api-access-gzgs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.832159 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.838097 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.840635 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:38 crc kubenswrapper[4762]: E0217 14:29:38.847997 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb podName:89bb3fe3-d9c4-4292-8a16-79abd3522621 nodeName:}" failed. No retries permitted until 2026-02-17 14:29:39.347961058 +0000 UTC m=+1459.927961710 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621") : error deleting /var/lib/kubelet/pods/89bb3fe3-d9c4-4292-8a16-79abd3522621/volume-subpaths: remove /var/lib/kubelet/pods/89bb3fe3-d9c4-4292-8a16-79abd3522621/volume-subpaths: no such file or directory Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.848135 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config" (OuterVolumeSpecName: "config") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.875866 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.875895 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.875926 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.875938 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:38 crc kubenswrapper[4762]: I0217 14:29:38.875946 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzgs9\" (UniqueName: \"kubernetes.io/projected/89bb3fe3-d9c4-4292-8a16-79abd3522621-kube-api-access-gzgs9\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.393488 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") pod \"89bb3fe3-d9c4-4292-8a16-79abd3522621\" (UID: \"89bb3fe3-d9c4-4292-8a16-79abd3522621\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.395514 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89bb3fe3-d9c4-4292-8a16-79abd3522621" (UID: "89bb3fe3-d9c4-4292-8a16-79abd3522621"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.421869 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.439684 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.445771 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496090 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496178 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgw7j\" (UniqueName: \"kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496383 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496421 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496437 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.496481 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb\") pod \"0b031b2f-52a6-403f-a100-198a4edacc4b\" (UID: \"0b031b2f-52a6-403f-a100-198a4edacc4b\") " Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.498692 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89bb3fe3-d9c4-4292-8a16-79abd3522621-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.636739 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j" (OuterVolumeSpecName: "kube-api-access-hgw7j") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "kube-api-access-hgw7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.722530 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgw7j\" (UniqueName: \"kubernetes.io/projected/0b031b2f-52a6-403f-a100-198a4edacc4b-kube-api-access-hgw7j\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.756086 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.791161 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.799708 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:39 crc kubenswrapper[4762]: I0217 14:29:39.801508 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-zxtc5"] Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.056494 4762 generic.go:334] "Generic (PLEG): container finished" podID="16658e34-885b-4693-9784-bd985a6acd52" containerID="03bedf90d9de4202da4df646416d5c25cf7f7c0b4f1a31a1cfc7b603b022827f" exitCode=0 Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.056574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lq7w9" event={"ID":"16658e34-885b-4693-9784-bd985a6acd52","Type":"ContainerDied","Data":"03bedf90d9de4202da4df646416d5c25cf7f7c0b4f1a31a1cfc7b603b022827f"} Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.061187 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.061212 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.062453 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" event={"ID":"6fe335f8-8a53-40c6-99ca-d106d01d65f5","Type":"ContainerStarted","Data":"3e9db673b2d22c3ee5af98435d6d8153a2110c9ba0f7085e32fb5322ff6efaf0"} Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.062942 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.068023 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.068344 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-6jnwv" event={"ID":"0b031b2f-52a6-403f-a100-198a4edacc4b","Type":"ContainerDied","Data":"a91e2eb5154fd9bc4c6262bd2b05158b13a207dce41c057dd125311e8aeec86f"} Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.068420 4762 scope.go:117] "RemoveContainer" containerID="9364a049d67000d238a3e572f01d668fb0dcfc5140e92a736522b9afd7064ef2" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.122285 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89bb3fe3-d9c4-4292-8a16-79abd3522621" path="/var/lib/kubelet/pods/89bb3fe3-d9c4-4292-8a16-79abd3522621/volumes" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.164091 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" podStartSLOduration=10.164063192 podStartE2EDuration="10.164063192s" podCreationTimestamp="2026-02-17 14:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:40.149570068 +0000 UTC m=+1460.729570740" watchObservedRunningTime="2026-02-17 14:29:40.164063192 +0000 UTC m=+1460.744063844" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.337085 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config" (OuterVolumeSpecName: "config") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.404492 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.434325 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.453272 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b031b2f-52a6-403f-a100-198a4edacc4b" (UID: "0b031b2f-52a6-403f-a100-198a4edacc4b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.508375 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.508413 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b031b2f-52a6-403f-a100-198a4edacc4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:40 crc kubenswrapper[4762]: I0217 14:29:40.577374 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 17 14:29:41 crc kubenswrapper[4762]: I0217 14:29:41.061495 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:41 crc kubenswrapper[4762]: I0217 14:29:41.073301 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-6jnwv"] Feb 17 14:29:41 crc kubenswrapper[4762]: I0217 14:29:41.110229 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerStarted","Data":"29b7ffe950bff3b23eb36343764930305d9e88e07568ec5b999ba75787a9c410"} Feb 17 14:29:41 crc kubenswrapper[4762]: I0217 14:29:41.112704 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerStarted","Data":"ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1"} Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.232606 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.298094 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b031b2f-52a6-403f-a100-198a4edacc4b" path="/var/lib/kubelet/pods/0b031b2f-52a6-403f-a100-198a4edacc4b/volumes" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.322503 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lq7w9" event={"ID":"16658e34-885b-4693-9784-bd985a6acd52","Type":"ContainerDied","Data":"36ed2eee59397a6a7057740bc406fd21381b738d570cea66e48c863463b666a6"} Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.322560 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ed2eee59397a6a7057740bc406fd21381b738d570cea66e48c863463b666a6" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.322624 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lq7w9" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.323994 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts\") pod \"16658e34-885b-4693-9784-bd985a6acd52\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.324124 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtlvt\" (UniqueName: \"kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt\") pod \"16658e34-885b-4693-9784-bd985a6acd52\" (UID: \"16658e34-885b-4693-9784-bd985a6acd52\") " Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.325321 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16658e34-885b-4693-9784-bd985a6acd52" (UID: "16658e34-885b-4693-9784-bd985a6acd52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.336003 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt" (OuterVolumeSpecName: "kube-api-access-mtlvt") pod "16658e34-885b-4693-9784-bd985a6acd52" (UID: "16658e34-885b-4693-9784-bd985a6acd52"). InnerVolumeSpecName "kube-api-access-mtlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.427299 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16658e34-885b-4693-9784-bd985a6acd52-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:42 crc kubenswrapper[4762]: I0217 14:29:42.427334 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtlvt\" (UniqueName: \"kubernetes.io/projected/16658e34-885b-4693-9784-bd985a6acd52-kube-api-access-mtlvt\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4762]: I0217 14:29:43.602880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerStarted","Data":"2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098"} Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.795636 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-log" containerID="cri-o://ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" gracePeriod=30 Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.805409 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-httpd" containerID="cri-o://2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" gracePeriod=30 Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.841775 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.841921 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.852154 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.856486 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.877516 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.877615 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Startup" pod="openstack/glance-default-external-api-0" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-httpd" Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.884115 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" cmd=["/bin/true"] Feb 17 14:29:44 crc kubenswrapper[4762]: E0217 14:29:44.884179 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Startup" pod="openstack/glance-default-external-api-0" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-log" Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.890151 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.890192 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.890251 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:29:44 crc kubenswrapper[4762]: I0217 14:29:44.945329 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.945304681 podStartE2EDuration="15.945304681s" podCreationTimestamp="2026-02-17 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:44.915318306 +0000 UTC m=+1465.495318958" watchObservedRunningTime="2026-02-17 14:29:44.945304681 +0000 UTC m=+1465.525305333" Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.138511 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"] Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.143449 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="dnsmasq-dns" containerID="cri-o://7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888" gracePeriod=10 Feb 17 14:29:45 crc kubenswrapper[4762]: E0217 14:29:45.793506 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae775d0e_8b93_454c_bbd5_6c06937759dd.slice/crio-ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae775d0e_8b93_454c_bbd5_6c06937759dd.slice/crio-conmon-ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae775d0e_8b93_454c_bbd5_6c06937759dd.slice/crio-conmon-2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod366b755e_ebe1_4687_861b_39bb7892755a.slice/crio-7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae775d0e_8b93_454c_bbd5_6c06937759dd.slice/crio-2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.933859 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerID="2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" exitCode=143 Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.933903 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerID="ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" exitCode=143 Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.933958 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerDied","Data":"2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098"} Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.934003 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerDied","Data":"ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1"} Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.941649 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerStarted","Data":"8a64a51e621631fb99492a652a99359a34add2a7fe9b9dbd5f466af479e4c423"} Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.941854 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-log" containerID="cri-o://29b7ffe950bff3b23eb36343764930305d9e88e07568ec5b999ba75787a9c410" gracePeriod=30 Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.942607 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-httpd" containerID="cri-o://8a64a51e621631fb99492a652a99359a34add2a7fe9b9dbd5f466af479e4c423" gracePeriod=30 Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.956034 4762 generic.go:334] "Generic (PLEG): container finished" podID="366b755e-ebe1-4687-861b-39bb7892755a" containerID="7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888" exitCode=0 Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.956117 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerDied","Data":"7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888"} Feb 17 14:29:45 crc kubenswrapper[4762]: I0217 14:29:45.975308 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.975286461 podStartE2EDuration="15.975286461s" podCreationTimestamp="2026-02-17 14:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:45.969609967 +0000 UTC m=+1466.549610619" watchObservedRunningTime="2026-02-17 14:29:45.975286461 +0000 UTC m=+1466.555287113" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.163937 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.175223 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.201353 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.202033 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vmhb8" event={"ID":"366b755e-ebe1-4687-861b-39bb7892755a","Type":"ContainerDied","Data":"a4ace29e2d4b4ff9032bdaba7cfaf401d3b8141bca49195b8d712bb31790c124"} Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.202099 4762 scope.go:117] "RemoveContainer" containerID="7f566a33f9382c001ceed3943d020ad43b69ea5c37d95501b57d60e015193888" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.226170 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ae775d0e-8b93-454c-bbd5-6c06937759dd","Type":"ContainerDied","Data":"6c5d7da2cfc0d3e7a645abc906a398a25a2559c66e10702d6d8b6dba13e4ea20"} Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.226204 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.253561 4762 generic.go:334] "Generic (PLEG): container finished" podID="6a776482-53fb-409c-a62b-22f41749eb7b" containerID="29b7ffe950bff3b23eb36343764930305d9e88e07568ec5b999ba75787a9c410" exitCode=143 Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.253606 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerDied","Data":"29b7ffe950bff3b23eb36343764930305d9e88e07568ec5b999ba75787a9c410"} Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.315823 4762 scope.go:117] "RemoveContainer" containerID="69e0d25e32180c6841c0d805ed308ef91a5b22c4e5ac3a36b2161727223b1837" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.337625 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.337974 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config\") pod \"366b755e-ebe1-4687-861b-39bb7892755a\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338008 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338031 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb\") pod \"366b755e-ebe1-4687-861b-39bb7892755a\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338083 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338126 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc\") pod \"366b755e-ebe1-4687-861b-39bb7892755a\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338252 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb\") pod \"366b755e-ebe1-4687-861b-39bb7892755a\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338277 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs\") pod 
\"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338313 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338339 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txhqr\" (UniqueName: \"kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr\") pod \"366b755e-ebe1-4687-861b-39bb7892755a\" (UID: \"366b755e-ebe1-4687-861b-39bb7892755a\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338369 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565k4\" (UniqueName: \"kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.338398 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts\") pod \"ae775d0e-8b93-454c-bbd5-6c06937759dd\" (UID: \"ae775d0e-8b93-454c-bbd5-6c06937759dd\") " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.339780 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.385975 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs" (OuterVolumeSpecName: "logs") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.388225 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts" (OuterVolumeSpecName: "scripts") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.388836 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr" (OuterVolumeSpecName: "kube-api-access-txhqr") pod "366b755e-ebe1-4687-861b-39bb7892755a" (UID: "366b755e-ebe1-4687-861b-39bb7892755a"). InnerVolumeSpecName "kube-api-access-txhqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.407201 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4" (OuterVolumeSpecName: "kube-api-access-565k4") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "kube-api-access-565k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.441306 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.441345 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae775d0e-8b93-454c-bbd5-6c06937759dd-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.441360 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txhqr\" (UniqueName: \"kubernetes.io/projected/366b755e-ebe1-4687-861b-39bb7892755a-kube-api-access-txhqr\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.441374 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565k4\" (UniqueName: \"kubernetes.io/projected/ae775d0e-8b93-454c-bbd5-6c06937759dd-kube-api-access-565k4\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.441385 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.462413 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.602288 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.676434 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "366b755e-ebe1-4687-861b-39bb7892755a" (UID: "366b755e-ebe1-4687-861b-39bb7892755a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.686185 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "366b755e-ebe1-4687-861b-39bb7892755a" (UID: "366b755e-ebe1-4687-861b-39bb7892755a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.708742 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.708771 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.714165 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config" (OuterVolumeSpecName: "config") pod "366b755e-ebe1-4687-861b-39bb7892755a" (UID: "366b755e-ebe1-4687-861b-39bb7892755a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.738895 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data" (OuterVolumeSpecName: "config-data") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.753940 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "366b755e-ebe1-4687-861b-39bb7892755a" (UID: "366b755e-ebe1-4687-861b-39bb7892755a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.815083 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae775d0e-8b93-454c-bbd5-6c06937759dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.815117 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.815125 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/366b755e-ebe1-4687-861b-39bb7892755a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.821025 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e" (OuterVolumeSpecName: "glance") pod "ae775d0e-8b93-454c-bbd5-6c06937759dd" (UID: "ae775d0e-8b93-454c-bbd5-6c06937759dd"). InnerVolumeSpecName "pvc-2f5442b2-466c-497d-97f0-c22697b04d0e". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.863647 4762 scope.go:117] "RemoveContainer" containerID="2d3919a9be47b104d24f5c55aa1f8fef1b5b3c9556a22bf1b52476e2cf26b098" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.869016 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"] Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.889558 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vmhb8"] Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.903835 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.930541 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") on node \"crc\" " Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.950326 4762 scope.go:117] "RemoveContainer" containerID="ca40b9601419699063071dd2c47ce7d715628459b737566d237dcccfb022b1b1" Feb 17 14:29:47 crc kubenswrapper[4762]: I0217 14:29:47.950939 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.027868 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029039 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-log" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029068 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-log" Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029093 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bb3fe3-d9c4-4292-8a16-79abd3522621" containerName="init" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029099 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bb3fe3-d9c4-4292-8a16-79abd3522621" containerName="init" Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029115 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16658e34-885b-4693-9784-bd985a6acd52" containerName="mariadb-account-create-update" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029121 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="16658e34-885b-4693-9784-bd985a6acd52" containerName="mariadb-account-create-update" Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029145 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="dnsmasq-dns" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029152 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="dnsmasq-dns" Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029166 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="init" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029171 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="init" Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029192 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-httpd"
Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029223 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="dnsmasq-dns"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029229 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="dnsmasq-dns"
Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029248 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029254 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: E0217 14:29:48.029271 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b031b2f-52a6-403f-a100-198a4edacc4b" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029280 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b031b2f-52a6-403f-a100-198a4edacc4b" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029695 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b031b2f-52a6-403f-a100-198a4edacc4b" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029725 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" containerName="dnsmasq-dns"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029750 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="16658e34-885b-4693-9784-bd985a6acd52" containerName="mariadb-account-create-update"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029762 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="89bb3fe3-d9c4-4292-8a16-79abd3522621" containerName="init"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029782 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="366b755e-ebe1-4687-861b-39bb7892755a" containerName="dnsmasq-dns"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029794 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-httpd"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.029808 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" containerName="glance-log"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.042218 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
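The csi_attacher entry above skips UnmountDevice because the kubevirt.io.hostpath-provisioner driver does not advertise the CSI STAGE_UNSTAGE_VOLUME capability: there is no NodeUnstageVolume step to undo, so the operation is reported as an immediate success (the UnmountDevice succeeded entry that follows). A dependency-free sketch of that decision; the capability name follows the CSI spec, everything else here is illustrative:

    package main

    import "fmt"

    // Capability name per the CSI spec; the function and types are a sketch,
    // not the kubelet's csi_attacher implementation.
    const stageUnstageVolume = "STAGE_UNSTAGE_VOLUME"

    // unmountDevice skips the unstage step entirely when the driver never
    // staged the volume, mirroring the log lines above.
    func unmountDevice(nodeCaps map[string]bool, volumeID string) error {
        if !nodeCaps[stageUnstageVolume] {
            fmt.Printf("STAGE_UNSTAGE_VOLUME not set; skipping UnmountDevice for %s\n", volumeID)
            return nil // surfaces as "UnmountDevice succeeded"
        }
        // ...otherwise the driver's NodeUnstageVolume would be called here...
        return nil
    }

    func main() {
        caps := map[string]bool{} // hostpath-style drivers advertise no staging capability
        _ = unmountDevice(caps, "pvc-2f5442b2-466c-497d-97f0-c22697b04d0e")
    }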
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.042400 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2f5442b2-466c-497d-97f0-c22697b04d0e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e") on node "crc"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.048234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.048418 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.057189 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.057808 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.114698 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366b755e-ebe1-4687-861b-39bb7892755a" path="/var/lib/kubelet/pods/366b755e-ebe1-4687-861b-39bb7892755a/volumes"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.115821 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae775d0e-8b93-454c-bbd5-6c06937759dd" path="/var/lib/kubelet/pods/ae775d0e-8b93-454c-bbd5-6c06937759dd/volumes"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.138452 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.240342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.240409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.240697 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2fq\" (UniqueName: \"kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.240817 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.241069 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.241308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.241483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.241541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.286075 4762 generic.go:334] "Generic (PLEG): container finished" podID="6a776482-53fb-409c-a62b-22f41749eb7b" containerID="8a64a51e621631fb99492a652a99359a34add2a7fe9b9dbd5f466af479e4c423" exitCode=0
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.286137 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerDied","Data":"8a64a51e621631fb99492a652a99359a34add2a7fe9b9dbd5f466af479e4c423"}
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349567 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349617 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0"
(UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349753 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2fq\" (UniqueName: \"kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.349833 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.350543 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.350787 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.356868 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.359525 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.359535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " 
pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.361740 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.361778 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd98bc01ad401fb0843a9dd71ca408e41c0fbbffed1920afb8717f05abdffdd4/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.366764 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.392498 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2fq\" (UniqueName: \"kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.428482 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " pod="openstack/glance-default-external-api-0" Feb 17 14:29:48 crc kubenswrapper[4762]: I0217 14:29:48.686335 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:49 crc kubenswrapper[4762]: I0217 14:29:49.328334 4762 generic.go:334] "Generic (PLEG): container finished" podID="30a7292d-960b-40f9-8b50-e6150d20d2b1" containerID="3ca505da16de76261387772b87b6a5926a9c46cd51520a42e4b6302224132fcf" exitCode=0 Feb 17 14:29:49 crc kubenswrapper[4762]: I0217 14:29:49.328608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llc75" event={"ID":"30a7292d-960b-40f9-8b50-e6150d20d2b1","Type":"ContainerDied","Data":"3ca505da16de76261387772b87b6a5926a9c46cd51520a42e4b6302224132fcf"} Feb 17 14:29:52 crc kubenswrapper[4762]: I0217 14:29:52.865321 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:52 crc kubenswrapper[4762]: I0217 14:29:52.879925 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.070494 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.070529 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.070729 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.071798 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.071973 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6fth\" (UniqueName: \"kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072013 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072084 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072119 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072161 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbfzn\" (UniqueName: \"kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072247 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: 
\"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072526 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072568 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072608 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle\") pod \"30a7292d-960b-40f9-8b50-e6150d20d2b1\" (UID: \"30a7292d-960b-40f9-8b50-e6150d20d2b1\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.072760 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs" (OuterVolumeSpecName: "logs") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.073413 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.076031 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.076065 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a776482-53fb-409c-a62b-22f41749eb7b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.077967 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts" (OuterVolumeSpecName: "scripts") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.078811 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn" (OuterVolumeSpecName: "kube-api-access-cbfzn") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "kube-api-access-cbfzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.079553 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.079635 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth" (OuterVolumeSpecName: "kube-api-access-n6fth") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "kube-api-access-n6fth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.081229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts" (OuterVolumeSpecName: "scripts") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.093107 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.111795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data" (OuterVolumeSpecName: "config-data") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: E0217 14:29:53.124473 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f podName:6a776482-53fb-409c-a62b-22f41749eb7b nodeName:}" failed. No retries permitted until 2026-02-17 14:29:53.623128121 +0000 UTC m=+1474.203128763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.138174 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30a7292d-960b-40f9-8b50-e6150d20d2b1" (UID: "30a7292d-960b-40f9-8b50-e6150d20d2b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.148628 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180101 4762 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180374 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6fth\" (UniqueName: \"kubernetes.io/projected/6a776482-53fb-409c-a62b-22f41749eb7b-kube-api-access-n6fth\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180496 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180587 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180687 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbfzn\" (UniqueName: \"kubernetes.io/projected/30a7292d-960b-40f9-8b50-e6150d20d2b1-kube-api-access-cbfzn\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180790 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.180878 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.181484 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.181592 4762 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30a7292d-960b-40f9-8b50-e6150d20d2b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.245516 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data" (OuterVolumeSpecName: "config-data") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.284781 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a776482-53fb-409c-a62b-22f41749eb7b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.408927 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-llc75" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.408924 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llc75" event={"ID":"30a7292d-960b-40f9-8b50-e6150d20d2b1","Type":"ContainerDied","Data":"fde7764079b617e83424cfb0e79b752fd42252aeb6507a14b6e386d7331c4302"} Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.409596 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fde7764079b617e83424cfb0e79b752fd42252aeb6507a14b6e386d7331c4302" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.416634 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a776482-53fb-409c-a62b-22f41749eb7b","Type":"ContainerDied","Data":"1999f4c8fe6c64ffd51e57cc0d18d220b7831edbdb8e620308824d1cd363aa08"} Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.416721 4762 scope.go:117] "RemoveContainer" containerID="8a64a51e621631fb99492a652a99359a34add2a7fe9b9dbd5f466af479e4c423" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.417029 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.697824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"6a776482-53fb-409c-a62b-22f41749eb7b\" (UID: \"6a776482-53fb-409c-a62b-22f41749eb7b\") " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.734367 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f" (OuterVolumeSpecName: "glance") pod "6a776482-53fb-409c-a62b-22f41749eb7b" (UID: "6a776482-53fb-409c-a62b-22f41749eb7b"). InnerVolumeSpecName "pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.801601 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") on node \"crc\" " Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.837240 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.837439 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f") on node "crc"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.859827 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.875338 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.892736 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:53 crc kubenswrapper[4762]: E0217 14:29:53.893583 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-httpd"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.893610 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-httpd"
Feb 17 14:29:53 crc kubenswrapper[4762]: E0217 14:29:53.893635 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-log"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.893703 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-log"
Feb 17 14:29:53 crc kubenswrapper[4762]: E0217 14:29:53.893727 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a7292d-960b-40f9-8b50-e6150d20d2b1" containerName="keystone-bootstrap"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.893736 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a7292d-960b-40f9-8b50-e6150d20d2b1" containerName="keystone-bootstrap"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.894251 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-log"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.894285 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" containerName="glance-httpd"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.894307 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a7292d-960b-40f9-8b50-e6150d20d2b1" containerName="keystone-bootstrap"
Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.896454 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
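[editor's note] The cpu_manager, state_mem and memory_manager lines above show stale per-container resource state being dropped when the glance and keystone pods are re-admitted: assignments are keyed by (podUID, containerName) and removed for pods no longer active. A sketch of that cleanup pass, with illustrative types rather than cpu_manager's actual state API:

package main

import "fmt"

// key identifies a resource assignment by owning pod and container.
type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod is no longer active,
// mirroring the "RemoveStaleState: removing container" lines above.
// Deleting from a map while ranging over it is safe in Go.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key]string{
		{"6a776482", "glance-httpd"}:       "cpuset 0-3",
		{"6a776482", "glance-log"}:         "cpuset 0-3",
		{"30a7292d", "keystone-bootstrap"}: "cpuset 0-3",
	}
	removeStaleState(assignments, map[string]bool{}) // neither pod is active anymore
}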
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.903234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.903354 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.903583 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.906136 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:53 crc kubenswrapper[4762]: I0217 14:29:53.994501 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-llc75"] Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.005170 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-llc75"] Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.007857 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.007925 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.007971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.008377 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.008500 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.008572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctfg\" (UniqueName: 
\"kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.008631 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.008905 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.112698 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.114845 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.115127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.115295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.115595 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.116449 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.118879 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctfg\" (UniqueName: 
\"kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.119182 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.128098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.128772 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.139546 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.168440 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.204584 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a7292d-960b-40f9-8b50-e6150d20d2b1" path="/var/lib/kubelet/pods/30a7292d-960b-40f9-8b50-e6150d20d2b1/volumes" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.208041 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.208088 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c94ac0752a1dcb91ec40ba4c560720e8a8734d2d1a06b78b6730ccf35fc18fc/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.209676 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.210562 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a776482-53fb-409c-a62b-22f41749eb7b" path="/var/lib/kubelet/pods/6a776482-53fb-409c-a62b-22f41749eb7b/volumes"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.211679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.213532 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5mknf"]
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.216498 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5mknf"]
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.216934 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5mknf"
Need to start a new one" pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.225270 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.225490 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.225566 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.225742 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jgkd7" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.225857 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.230563 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctfg\" (UniqueName: \"kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.312746 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332222 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332447 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle\") pod 
\"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.332479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcftt\" (UniqueName: \"kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435301 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcftt\" (UniqueName: \"kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435423 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435520 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.435561 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.439990 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.440353 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:29:54 
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.440859 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.444304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.446514 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.453999 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcftt\" (UniqueName: \"kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt\") pod \"keystone-bootstrap-5mknf\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " pod="openstack/keystone-bootstrap-5mknf"
Feb 17 14:29:54 crc kubenswrapper[4762]: I0217 14:29:54.532048 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:55 crc kubenswrapper[4762]: I0217 14:29:55.167930 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5mknf"
Feb 17 14:29:59 crc kubenswrapper[4762]: E0217 14:29:59.573063 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Feb 17 14:29:59 crc kubenswrapper[4762]: E0217 14:29:59.573886 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfchb6h9ch98h56bh578h556h9chd8h74h594h649h659h58bhd9h54bhf9hd7h699h589hdch76h579h567h5ch555h648h57dhcbhf9h669h5fbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s6z6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a4d225d9-98bc-48c2-94a2-0c74c3f11d89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.148782 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq"]
Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.150793 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq"
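[editor's note] The "PullImage from image service failed ... code = Canceled desc = copying config: context canceled" lines mean the CRI image pull was aborted because its context was cancelled mid-copy (for example, a pull deadline expiring or the pod worker shutting the operation down). A sketch of how such a cancellation surfaces, with pullImage standing in for the runtime's image-service call (the gRPC plumbing is elided and the timeout is an assumption):

package main

import (
	"context"
	"fmt"
	"time"
)

// pullImage pretends to copy image layers from a registry; if the caller's
// context is cancelled first, it reports a Canceled error like the one above.
func pullImage(ctx context.Context, image string) error {
	select {
	case <-time.After(10 * time.Second): // stand-in for a slow registry copy
		return nil
	case <-ctx.Done():
		return fmt.Errorf("rpc error: code = Canceled desc = copying config: %w", ctx.Err())
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 100*time.Millisecond)
	defer cancel()
	err := pullImage(ctx, "quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified")
	fmt.Println("PullImage from image service failed:", err)
}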
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.154173 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.155216 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.159016 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq"] Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.295165 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwjd\" (UniqueName: \"kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.295513 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.295593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.407287 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwjd\" (UniqueName: \"kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.407372 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.407452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.408386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume\") pod 
\"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.413315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.430238 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwjd\" (UniqueName: \"kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd\") pod \"collect-profiles-29522310-ttbbq\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:00 crc kubenswrapper[4762]: I0217 14:30:00.485826 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:08 crc kubenswrapper[4762]: I0217 14:30:08.140008 4762 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podaa77bfe8-fbc4-42c5-923a-2909909db58d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podaa77bfe8-fbc4-42c5-923a-2909909db58d] : Timed out while waiting for systemd to remove kubepods-besteffort-podaa77bfe8_fbc4_42c5_923a_2909909db58d.slice" Feb 17 14:30:08 crc kubenswrapper[4762]: E0217 14:30:08.140718 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podaa77bfe8-fbc4-42c5-923a-2909909db58d] : unable to destroy cgroup paths for cgroup [kubepods besteffort podaa77bfe8-fbc4-42c5-923a-2909909db58d] : Timed out while waiting for systemd to remove kubepods-besteffort-podaa77bfe8_fbc4_42c5_923a_2909909db58d.slice" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" Feb 17 14:30:08 crc kubenswrapper[4762]: I0217 14:30:08.310395 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jzb4k" Feb 17 14:30:08 crc kubenswrapper[4762]: I0217 14:30:08.351450 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:30:08 crc kubenswrapper[4762]: I0217 14:30:08.360597 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jzb4k"] Feb 17 14:30:10 crc kubenswrapper[4762]: I0217 14:30:10.085276 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa77bfe8-fbc4-42c5-923a-2909909db58d" path="/var/lib/kubelet/pods/aa77bfe8-fbc4-42c5-923a-2909909db58d/volumes" Feb 17 14:30:10 crc kubenswrapper[4762]: I0217 14:30:10.347321 4762 generic.go:334] "Generic (PLEG): container finished" podID="cc27563b-a5bb-4e82-a286-e0628e7c07b3" containerID="cd1e6e1172c720beeffc6bfbd56af158da86b64d766a642b82e86e719c4d0803" exitCode=0 Feb 17 14:30:10 crc kubenswrapper[4762]: I0217 14:30:10.347364 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wtc2k" event={"ID":"cc27563b-a5bb-4e82-a286-e0628e7c07b3","Type":"ContainerDied","Data":"cd1e6e1172c720beeffc6bfbd56af158da86b64d766a642b82e86e719c4d0803"} Feb 17 14:30:15 crc kubenswrapper[4762]: E0217 14:30:15.561466 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Feb 17 14:30:15 crc kubenswrapper[4762]: E0217 14:30:15.562185 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wz4t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
Feb 17 14:30:15 crc kubenswrapper[4762]: E0217 14:30:15.563354 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-h7qp8" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3"
Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.074478 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.074907 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6n47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-smktq_openstack(a9c276b7-cca9-42c7-8605-5f2bfa0da0e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.076750 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-smktq" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1"
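[editor's note] These heat and barbican entries, together with the ImagePullBackOff lines that follow at 14:30:16.429615 and 14:30:16.437812, show the two-state image pull cycle: a failed attempt is reported as ErrImagePull, and while the backoff window is open, subsequent pod syncs report ImagePullBackOff instead of retrying. A minimal model of that cycle (durations are illustrative, not kubelet's exact backoff table):

package main

import (
	"fmt"
	"time"
)

// puller tracks the backoff window for one image.
type puller struct {
	nextTry time.Time
	backoff time.Duration
}

// sync returns ImagePullBackOff while the window is open; otherwise it
// attempts the pull (pretending it fails again) and widens the window.
func (p *puller) sync(now time.Time, image string) string {
	if now.Before(p.nextTry) {
		return fmt.Sprintf("ImagePullBackOff: Back-off pulling image %q", image)
	}
	if p.backoff == 0 {
		p.backoff = 10 * time.Second
	} else {
		p.backoff *= 2
	}
	p.nextTry = now.Add(p.backoff)
	return "ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled"
}

func main() {
	var p puller
	now := time.Now()
	img := "quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
	fmt.Println(p.sync(now, img))                // first attempt fails: ErrImagePull
	fmt.Println(p.sync(now.Add(time.Second), img)) // within the window: ImagePullBackOff
}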
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.106634 4762 scope.go:117] "RemoveContainer" containerID="29b7ffe950bff3b23eb36343764930305d9e88e07568ec5b999ba75787a9c410"
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.200621 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wtc2k"
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.307182 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle\") pod \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") "
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.307323 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config\") pod \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") "
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.307525 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shf9n\" (UniqueName: \"kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n\") pod \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\" (UID: \"cc27563b-a5bb-4e82-a286-e0628e7c07b3\") "
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.315941 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n" (OuterVolumeSpecName: "kube-api-access-shf9n") pod "cc27563b-a5bb-4e82-a286-e0628e7c07b3" (UID: "cc27563b-a5bb-4e82-a286-e0628e7c07b3"). InnerVolumeSpecName "kube-api-access-shf9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.340090 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config" (OuterVolumeSpecName: "config") pod "cc27563b-a5bb-4e82-a286-e0628e7c07b3" (UID: "cc27563b-a5bb-4e82-a286-e0628e7c07b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.348682 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc27563b-a5bb-4e82-a286-e0628e7c07b3" (UID: "cc27563b-a5bb-4e82-a286-e0628e7c07b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.414556 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shf9n\" (UniqueName: \"kubernetes.io/projected/cc27563b-a5bb-4e82-a286-e0628e7c07b3-kube-api-access-shf9n\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.414600 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.414621 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc27563b-a5bb-4e82-a286-e0628e7c07b3-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.421909 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wtc2k" Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.421899 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wtc2k" event={"ID":"cc27563b-a5bb-4e82-a286-e0628e7c07b3","Type":"ContainerDied","Data":"54d95b91a106a65b0420660b469bd04a1f3c060ba563e73230620e8f7980b08c"} Feb 17 14:30:16 crc kubenswrapper[4762]: I0217 14:30:16.422104 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d95b91a106a65b0420660b469bd04a1f3c060ba563e73230620e8f7980b08c" Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.429615 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-smktq" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.437812 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-h7qp8" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" Feb 17 14:30:16 crc kubenswrapper[4762]: E0217 14:30:16.552309 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27563b_a5bb_4e82_a286_e0628e7c07b3.slice/crio-54d95b91a106a65b0420660b469bd04a1f3c060ba563e73230620e8f7980b08c\": RecentStats: unable to find data in memory cache]" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.379561 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"] Feb 17 14:30:17 crc kubenswrapper[4762]: E0217 14:30:17.380348 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc27563b-a5bb-4e82-a286-e0628e7c07b3" containerName="neutron-db-sync" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.380363 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc27563b-a5bb-4e82-a286-e0628e7c07b3" containerName="neutron-db-sync" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.380626 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc27563b-a5bb-4e82-a286-e0628e7c07b3" containerName="neutron-db-sync" Feb 17 14:30:17 crc 
kubenswrapper[4762]: I0217 14:30:17.381800 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.399835 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"] Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438136 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqwph\" (UniqueName: \"kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438516 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.438582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543131 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " 
pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543348 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqwph\" (UniqueName: \"kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543450 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.543509 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.544757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.544774 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.545024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.545556 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.547190 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.587670 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqwph\" (UniqueName: \"kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph\") pod \"dnsmasq-dns-6b7b667979-gd7pw\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") " pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.670781 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-675485888-d9mtx"] Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.673184 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.679144 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.679342 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wdfj6" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.679358 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.679382 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.701125 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675485888-d9mtx"] Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.715315 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.749385 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsd7\" (UniqueName: \"kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.749514 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.749558 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.749601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.749798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config\") pod 
\"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.853891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.854225 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.854323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsd7\" (UniqueName: \"kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.854427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.854478 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.864545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.874530 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.879330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.889091 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: I0217 14:30:17.918665 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsd7\" (UniqueName: \"kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7\") pod \"neutron-675485888-d9mtx\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") " pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:17 crc kubenswrapper[4762]: E0217 14:30:17.941378 4762 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 14:30:17 crc kubenswrapper[4762]: E0217 14:30:17.941540 4762 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrmjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-95lkq_openstack(d6ea0210-709e-4a47-87d1-48c811c0ab85): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:30:17 crc kubenswrapper[4762]: E0217 14:30:17.942743 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/cinder-db-sync-95lkq" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" Feb 17 14:30:18 crc kubenswrapper[4762]: I0217 14:30:18.034800 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:18 crc kubenswrapper[4762]: E0217 14:30:18.457538 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-95lkq" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" Feb 17 14:30:18 crc kubenswrapper[4762]: I0217 14:30:18.588494 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.090825 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5mknf"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.383289 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.466817 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.474721 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lq7n6" event={"ID":"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64","Type":"ContainerStarted","Data":"e6e299e92349cffa5cd65ef41d287abc4aa99b44f8b6799fabb9fa73461b3607"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.536695 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerStarted","Data":"c52e7a3c95daf9c0b479235656d1ffc6ff961388e379530da8215e363c02e4db"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.537790 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.550054 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerStarted","Data":"5882e5f11108e7bb28b49f159bd3440debfcda55922e2e6d17e0c46a9c28451e"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.558804 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" event={"ID":"b4bc181f-2e98-4498-9d56-311e015e6086","Type":"ContainerStarted","Data":"f51033044932308890a904891ca29430b106ceb2d9e3765506f829d5eb36f488"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.560163 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerStarted","Data":"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.561198 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5mknf" event={"ID":"53984f9c-be03-44a6-91da-65972a4b4cd5","Type":"ContainerStarted","Data":"71f7ef78d1a509cea231aa3153f7810fb048ae9cc5eb752b3f5c691f2c15e8eb"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.562967 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" event={"ID":"8befecb9-4510-4921-a212-e80a8b832855","Type":"ContainerStarted","Data":"e63c95946f220211e49d9be2e6985955101adc0cd48c0a262fc88dded9dff330"} Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.568507 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675485888-d9mtx"] Feb 17 14:30:19 crc kubenswrapper[4762]: I0217 14:30:19.587042 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lq7n6" podStartSLOduration=8.813720038 podStartE2EDuration="49.587014288s" podCreationTimestamp="2026-02-17 14:29:30 +0000 UTC" firstStartedPulling="2026-02-17 14:29:35.313442648 +0000 UTC m=+1455.893443300" lastFinishedPulling="2026-02-17 14:30:16.086736898 +0000 UTC m=+1496.666737550" observedRunningTime="2026-02-17 14:30:19.521665782 +0000 UTC m=+1500.101666434" watchObservedRunningTime="2026-02-17 14:30:19.587014288 +0000 UTC m=+1500.167014940" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.043721 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.046450 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.064059 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.064127 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.163042 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.165185 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.165528 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.165747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.165881 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: 
\"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.166014 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.166107 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhjh\" (UniqueName: \"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.226311 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268567 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268724 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268769 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268824 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhjh\" (UniqueName: \"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268963 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.268991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.282952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.283546 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.311460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.313904 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.318167 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.319155 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.353507 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhjh\" (UniqueName: \"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh\") pod \"neutron-6f47bdcf85-g4f9w\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.415805 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.643090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerStarted","Data":"80f9aa22b822f0b15afdc8fa63b813a132cb5897e20b1c25212e7e3ca7e5cd55"} Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.647247 4762 generic.go:334] "Generic (PLEG): container finished" podID="b4bc181f-2e98-4498-9d56-311e015e6086" containerID="883e6d524b53ef4643b9df01f74b4a6383f3c3b33382aad9db5b6dc136fce5dc" exitCode=0 Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.647312 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" event={"ID":"b4bc181f-2e98-4498-9d56-311e015e6086","Type":"ContainerDied","Data":"883e6d524b53ef4643b9df01f74b4a6383f3c3b33382aad9db5b6dc136fce5dc"} Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.672045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5mknf" event={"ID":"53984f9c-be03-44a6-91da-65972a4b4cd5","Type":"ContainerStarted","Data":"f865c92eac1476eafc2c0c30e7afe7ee2571d6f3d907e473e0ff9d179a5c8edf"} Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.701664 4762 generic.go:334] "Generic (PLEG): container finished" podID="8befecb9-4510-4921-a212-e80a8b832855" containerID="005c50eaea1c444d6f0b66c6862777bbe57b02af1edba0414efc1c5441023635" exitCode=0 Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.702182 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" event={"ID":"8befecb9-4510-4921-a212-e80a8b832855","Type":"ContainerDied","Data":"005c50eaea1c444d6f0b66c6862777bbe57b02af1edba0414efc1c5441023635"} Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.724946 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5mknf" podStartSLOduration=26.724922051 podStartE2EDuration="26.724922051s" podCreationTimestamp="2026-02-17 14:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:20.701620818 +0000 UTC m=+1501.281621470" watchObservedRunningTime="2026-02-17 14:30:20.724922051 +0000 UTC m=+1501.304922703" Feb 17 14:30:20 crc kubenswrapper[4762]: I0217 14:30:20.728529 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerStarted","Data":"e016c781ba3daa1a33a35740fe8ca67eeaee1607cd632100618f4f5dff090392"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.181011 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:30:21 crc kubenswrapper[4762]: W0217 14:30:21.201142 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod922b4fd8_4192_45a2_9fad_c6e49f93e9eb.slice/crio-68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a WatchSource:0}: Error finding container 68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a: Status 404 returned error can't find the container with id 68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.786871 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerStarted","Data":"7eb572168b3935d9726979198fa16470637b31e2930463f584f0deeb0929710b"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.787159 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerStarted","Data":"8a1cf66aff096f324fbf95108d237110a5c977b8cb857a0ed48d96fbd625213d"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.791598 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-675485888-d9mtx" Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.814464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerStarted","Data":"edb0b37b8e520ee4aef70d35fcf290ea941c0e99ba43b8495f41be5f2c8163b6"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.827310 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerStarted","Data":"0b62a9d98e888b0e0dc59d942af63064b26f4e10cb512add83ab42d2ca101810"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.829153 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-675485888-d9mtx" podStartSLOduration=4.829130927 podStartE2EDuration="4.829130927s" podCreationTimestamp="2026-02-17 14:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:21.824209784 +0000 UTC m=+1502.404210456" watchObservedRunningTime="2026-02-17 14:30:21.829130927 +0000 UTC m=+1502.409131579" Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.839316 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerStarted","Data":"40bfadd0be5a49cf632f62cc2d679da6a27b3b7606bb06e8c319ffb998c7a00a"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.839404 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerStarted","Data":"68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.846539 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" event={"ID":"8befecb9-4510-4921-a212-e80a8b832855","Type":"ContainerStarted","Data":"01c3bfedbbdda822752c16fbf30ea475f2a4e991d8289023001b4761f36dc674"} Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.862581 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=34.862555045 podStartE2EDuration="34.862555045s" podCreationTimestamp="2026-02-17 14:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:21.847373313 +0000 UTC m=+1502.427373975" watchObservedRunningTime="2026-02-17 14:30:21.862555045 +0000 UTC m=+1502.442555697" Feb 17 14:30:21 crc kubenswrapper[4762]: I0217 14:30:21.904701 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" podStartSLOduration=4.90467809 podStartE2EDuration="4.90467809s" podCreationTimestamp="2026-02-17 14:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:21.877481911 +0000 UTC m=+1502.457482563" watchObservedRunningTime="2026-02-17 14:30:21.90467809 +0000 UTC m=+1502.484678742" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.394417 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.470183 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwjd\" (UniqueName: \"kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd\") pod \"b4bc181f-2e98-4498-9d56-311e015e6086\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.470296 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume\") pod \"b4bc181f-2e98-4498-9d56-311e015e6086\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.470419 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume\") pod \"b4bc181f-2e98-4498-9d56-311e015e6086\" (UID: \"b4bc181f-2e98-4498-9d56-311e015e6086\") " Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.471859 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume" (OuterVolumeSpecName: "config-volume") pod "b4bc181f-2e98-4498-9d56-311e015e6086" (UID: "b4bc181f-2e98-4498-9d56-311e015e6086"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.491277 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b4bc181f-2e98-4498-9d56-311e015e6086" (UID: "b4bc181f-2e98-4498-9d56-311e015e6086"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.491535 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd" (OuterVolumeSpecName: "kube-api-access-zbwjd") pod "b4bc181f-2e98-4498-9d56-311e015e6086" (UID: "b4bc181f-2e98-4498-9d56-311e015e6086"). InnerVolumeSpecName "kube-api-access-zbwjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.577742 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4bc181f-2e98-4498-9d56-311e015e6086-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.577779 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwjd\" (UniqueName: \"kubernetes.io/projected/b4bc181f-2e98-4498-9d56-311e015e6086-kube-api-access-zbwjd\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.577792 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4bc181f-2e98-4498-9d56-311e015e6086-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.716254 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.874359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerStarted","Data":"32a94d62c2e7d2a6766a7870466783bc42e46fbe12f626f85b1a7961462224e0"} Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.875400 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.897001 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerStarted","Data":"269c14e2b5e7f2da1726887ab2d0730d9718b9f869f69708d78797d066565255"} Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.905110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" event={"ID":"b4bc181f-2e98-4498-9d56-311e015e6086","Type":"ContainerDied","Data":"f51033044932308890a904891ca29430b106ceb2d9e3765506f829d5eb36f488"} Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.905167 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f51033044932308890a904891ca29430b106ceb2d9e3765506f829d5eb36f488" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.905259 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-ttbbq" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.941519 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f47bdcf85-g4f9w" podStartSLOduration=3.941496746 podStartE2EDuration="3.941496746s" podCreationTimestamp="2026-02-17 14:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:22.914915054 +0000 UTC m=+1503.494915706" watchObservedRunningTime="2026-02-17 14:30:22.941496746 +0000 UTC m=+1503.521497388" Feb 17 14:30:22 crc kubenswrapper[4762]: I0217 14:30:22.986571 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.98654638 podStartE2EDuration="29.98654638s" podCreationTimestamp="2026-02-17 14:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:22.951781285 +0000 UTC m=+1503.531781937" watchObservedRunningTime="2026-02-17 14:30:22.98654638 +0000 UTC m=+1503.566547032" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.532831 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.533122 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.533146 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.533159 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.575947 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.602360 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.623247 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:30:24 crc kubenswrapper[4762]: I0217 14:30:24.623296 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:30:27 crc kubenswrapper[4762]: I0217 14:30:27.717766 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" Feb 17 14:30:27 crc kubenswrapper[4762]: I0217 14:30:27.779655 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:30:27 crc kubenswrapper[4762]: I0217 14:30:27.779936 4762 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="dnsmasq-dns" containerID="cri-o://3e9db673b2d22c3ee5af98435d6d8153a2110c9ba0f7085e32fb5322ff6efaf0" gracePeriod=10 Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.722229 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.722782 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.764356 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.764832 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.977724 4762 generic.go:334] "Generic (PLEG): container finished" podID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerID="3e9db673b2d22c3ee5af98435d6d8153a2110c9ba0f7085e32fb5322ff6efaf0" exitCode=0 Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.977768 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" event={"ID":"6fe335f8-8a53-40c6-99ca-d106d01d65f5","Type":"ContainerDied","Data":"3e9db673b2d22c3ee5af98435d6d8153a2110c9ba0f7085e32fb5322ff6efaf0"} Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.978171 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 14:30:28 crc kubenswrapper[4762]: I0217 14:30:28.978209 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.541717 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645327 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645469 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645525 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645565 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645598 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.645632 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6x2\" (UniqueName: \"kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2\") pod \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\" (UID: \"6fe335f8-8a53-40c6-99ca-d106d01d65f5\") " Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.671676 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2" (OuterVolumeSpecName: "kube-api-access-rs6x2") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "kube-api-access-rs6x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.705312 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.706252 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config" (OuterVolumeSpecName: "config") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.709343 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.720557 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.728041 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fe335f8-8a53-40c6-99ca-d106d01d65f5" (UID: "6fe335f8-8a53-40c6-99ca-d106d01d65f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749161 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749199 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749209 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749219 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749228 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs6x2\" (UniqueName: \"kubernetes.io/projected/6fe335f8-8a53-40c6-99ca-d106d01d65f5-kube-api-access-rs6x2\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4762]: I0217 14:30:29.749239 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe335f8-8a53-40c6-99ca-d106d01d65f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.010462 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.010466 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" event={"ID":"6fe335f8-8a53-40c6-99ca-d106d01d65f5","Type":"ContainerDied","Data":"daa12be9136315a7ea901928c1b8cf881f724e11a2a357553880e4f4b82d665b"} Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.010595 4762 scope.go:117] "RemoveContainer" containerID="3e9db673b2d22c3ee5af98435d6d8153a2110c9ba0f7085e32fb5322ff6efaf0" Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.095323 4762 scope.go:117] "RemoveContainer" containerID="2044375e66eb74aa89a42d758449bbdffc23deab5ea26f284fe1a52af5696bb4" Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.114947 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:30:30 crc kubenswrapper[4762]: I0217 14:30:30.114992 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-2pthv"] Feb 17 14:30:32 crc kubenswrapper[4762]: I0217 14:30:32.043849 4762 generic.go:334] "Generic (PLEG): container finished" podID="8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" containerID="e6e299e92349cffa5cd65ef41d287abc4aa99b44f8b6799fabb9fa73461b3607" exitCode=0 Feb 17 14:30:32 crc kubenswrapper[4762]: I0217 14:30:32.044039 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lq7n6" event={"ID":"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64","Type":"ContainerDied","Data":"e6e299e92349cffa5cd65ef41d287abc4aa99b44f8b6799fabb9fa73461b3607"} Feb 17 14:30:32 crc kubenswrapper[4762]: I0217 14:30:32.048207 4762 generic.go:334] "Generic (PLEG): container finished" podID="53984f9c-be03-44a6-91da-65972a4b4cd5" containerID="f865c92eac1476eafc2c0c30e7afe7ee2571d6f3d907e473e0ff9d179a5c8edf" exitCode=0 Feb 17 14:30:32 crc kubenswrapper[4762]: I0217 14:30:32.048257 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5mknf" event={"ID":"53984f9c-be03-44a6-91da-65972a4b4cd5","Type":"ContainerDied","Data":"f865c92eac1476eafc2c0c30e7afe7ee2571d6f3d907e473e0ff9d179a5c8edf"} Feb 17 14:30:32 crc kubenswrapper[4762]: I0217 14:30:32.104742 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" path="/var/lib/kubelet/pods/6fe335f8-8a53-40c6-99ca-d106d01d65f5/volumes" Feb 17 14:30:34 crc kubenswrapper[4762]: I0217 14:30:34.255944 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-2pthv" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: i/o timeout" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.225511 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.239809 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lq7n6" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.411080 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.438638 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts\") pod \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.438691 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.438750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443533 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7snz\" (UniqueName: \"kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz\") pod \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443622 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle\") pod \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443670 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443780 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs\") pod \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443902 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.443935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcftt\" (UniqueName: \"kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.444057 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts\") pod \"53984f9c-be03-44a6-91da-65972a4b4cd5\" (UID: \"53984f9c-be03-44a6-91da-65972a4b4cd5\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.444119 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data\") pod \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\" (UID: \"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64\") " Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.553438 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.553710 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz" (OuterVolumeSpecName: "kube-api-access-w7snz") pod "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" (UID: "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64"). InnerVolumeSpecName "kube-api-access-w7snz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.554605 4762 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.554667 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7snz\" (UniqueName: \"kubernetes.io/projected/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-kube-api-access-w7snz\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.555210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs" (OuterVolumeSpecName: "logs") pod "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" (UID: "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.555384 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.555561 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt" (OuterVolumeSpecName: "kube-api-access-zcftt") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "kube-api-access-zcftt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.558998 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts" (OuterVolumeSpecName: "scripts") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.565211 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.565403 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.568497 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts" (OuterVolumeSpecName: "scripts") pod "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" (UID: "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.660164 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcftt\" (UniqueName: \"kubernetes.io/projected/53984f9c-be03-44a6-91da-65972a4b4cd5-kube-api-access-zcftt\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.660199 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.660210 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.660221 4762 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.660231 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.686105 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data" (OuterVolumeSpecName: "config-data") pod "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" (UID: "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.763024 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.928540 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" (UID: "8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4762]: I0217 14:30:35.967596 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data" (OuterVolumeSpecName: "config-data") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.133239 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.150002 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53984f9c-be03-44a6-91da-65972a4b4cd5" (UID: "53984f9c-be03-44a6-91da-65972a4b4cd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.164310 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5mknf" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.220245 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lq7n6" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.238352 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.238386 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53984f9c-be03-44a6-91da-65972a4b4cd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.387827 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5mknf" event={"ID":"53984f9c-be03-44a6-91da-65972a4b4cd5","Type":"ContainerDied","Data":"71f7ef78d1a509cea231aa3153f7810fb048ae9cc5eb752b3f5c691f2c15e8eb"} Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.387874 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f7ef78d1a509cea231aa3153f7810fb048ae9cc5eb752b3f5c691f2c15e8eb" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.387922 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.387934 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lq7n6" event={"ID":"8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64","Type":"ContainerDied","Data":"fdb14fa858fb20e0a11d66cce487ff3929657dd1d7d60d1ae2f3b3e5601969c5"} Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.387945 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb14fa858fb20e0a11d66cce487ff3929657dd1d7d60d1ae2f3b3e5601969c5" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429145 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-86657f9797-7sk9h"] Feb 17 14:30:36 
crc kubenswrapper[4762]: E0217 14:30:36.429723 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" containerName="placement-db-sync" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429737 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" containerName="placement-db-sync" Feb 17 14:30:36 crc kubenswrapper[4762]: E0217 14:30:36.429756 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="init" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429762 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="init" Feb 17 14:30:36 crc kubenswrapper[4762]: E0217 14:30:36.429769 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bc181f-2e98-4498-9d56-311e015e6086" containerName="collect-profiles" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429776 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bc181f-2e98-4498-9d56-311e015e6086" containerName="collect-profiles" Feb 17 14:30:36 crc kubenswrapper[4762]: E0217 14:30:36.429788 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="dnsmasq-dns" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429794 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="dnsmasq-dns" Feb 17 14:30:36 crc kubenswrapper[4762]: E0217 14:30:36.429814 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53984f9c-be03-44a6-91da-65972a4b4cd5" containerName="keystone-bootstrap" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.429819 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="53984f9c-be03-44a6-91da-65972a4b4cd5" containerName="keystone-bootstrap" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.430024 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="53984f9c-be03-44a6-91da-65972a4b4cd5" containerName="keystone-bootstrap" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.430038 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bc181f-2e98-4498-9d56-311e015e6086" containerName="collect-profiles" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.430049 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe335f8-8a53-40c6-99ca-d106d01d65f5" containerName="dnsmasq-dns" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.430062 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" containerName="placement-db-sync" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.430867 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.437192 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.437436 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jgkd7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.437567 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.437862 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.437980 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.438059 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.441995 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86657f9797-7sk9h"] Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.690331 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-config-data\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.695232 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-credential-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.698846 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-scripts\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.698994 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-combined-ca-bundle\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.699060 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"] Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.699298 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-fernet-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.699783 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvncd\" 
(UniqueName: \"kubernetes.io/projected/a23de52d-c70a-4f76-b067-cf4fef32b584-kube-api-access-tvncd\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.699944 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-internal-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.699995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-public-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.701187 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.712396 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.712614 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.712764 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.712945 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sf2vs" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.713112 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.721275 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"] Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-fernet-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802319 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvncd\" (UniqueName: \"kubernetes.io/projected/a23de52d-c70a-4f76-b067-cf4fef32b584-kube-api-access-tvncd\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802339 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-internal-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802358 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-public-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-config-data\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802414 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xmk\" (UniqueName: \"kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802495 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802559 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802609 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-credential-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-scripts\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.802724 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-combined-ca-bundle\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.812538 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-combined-ca-bundle\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.813379 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-fernet-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.815594 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-scripts\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.822741 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-internal-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.825344 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-public-tls-certs\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.841392 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-config-data\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.841772 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a23de52d-c70a-4f76-b067-cf4fef32b584-credential-keys\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " 
pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.851414 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvncd\" (UniqueName: \"kubernetes.io/projected/a23de52d-c70a-4f76-b067-cf4fef32b584-kube-api-access-tvncd\") pod \"keystone-86657f9797-7sk9h\" (UID: \"a23de52d-c70a-4f76-b067-cf4fef32b584\") " pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909076 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909159 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xmk\" (UniqueName: \"kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909227 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909292 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909329 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.909363 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.917908 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.921139 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.922532 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.923268 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.924010 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.938490 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:36 crc kubenswrapper[4762]: I0217 14:30:36.952295 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xmk\" (UniqueName: \"kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk\") pod \"placement-9f47cdcfb-z94h7\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") " pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.193268 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.197840 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.233538 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-74c5954b4-v4d8z"] Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.236788 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.260221 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-74c5954b4-v4d8z"] Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.282024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smktq" event={"ID":"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1","Type":"ContainerStarted","Data":"3fb17ebbd8e146f643a15b507ad009691f75a0af1f916266e833930bfdc95b3a"} Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.302205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h7qp8" event={"ID":"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3","Type":"ContainerStarted","Data":"17aab810c353d27f1546f39fc1e9219e77f96483a29332f4c8a4803d99560833"} Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.370233 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-smktq" podStartSLOduration=8.784986246999999 podStartE2EDuration="1m7.370208821s" podCreationTimestamp="2026-02-17 14:29:30 +0000 UTC" firstStartedPulling="2026-02-17 14:29:36.675153211 +0000 UTC m=+1457.255153863" lastFinishedPulling="2026-02-17 14:30:35.260375785 +0000 UTC m=+1515.840376437" observedRunningTime="2026-02-17 14:30:37.339890297 +0000 UTC m=+1517.919890949" watchObservedRunningTime="2026-02-17 14:30:37.370208821 +0000 UTC m=+1517.950209473" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.392701 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerStarted","Data":"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3"} Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.393290 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-h7qp8" podStartSLOduration=6.843771295 podStartE2EDuration="1m8.393267117s" podCreationTimestamp="2026-02-17 14:29:29 +0000 UTC" firstStartedPulling="2026-02-17 14:29:33.711761067 +0000 UTC m=+1454.291761719" lastFinishedPulling="2026-02-17 14:30:35.261256889 +0000 UTC m=+1515.841257541" observedRunningTime="2026-02-17 14:30:37.390216934 +0000 UTC m=+1517.970217586" watchObservedRunningTime="2026-02-17 14:30:37.393267117 +0000 UTC m=+1517.973267769" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.400672 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-95lkq" event={"ID":"d6ea0210-709e-4a47-87d1-48c811c0ab85","Type":"ContainerStarted","Data":"c6759c99c71e5d3d5fe8cf99a1ee57341afec410927c40befc9081b3cbae7a1e"} Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.446487 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-public-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.447787 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-internal-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 
14:30:37.448135 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-config-data\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.448294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-scripts\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.448461 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnrc\" (UniqueName: \"kubernetes.io/projected/c64547d6-018c-4123-9017-3f5ef64949b2-kube-api-access-mmnrc\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.448743 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64547d6-018c-4123-9017-3f5ef64949b2-logs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.448988 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-combined-ca-bundle\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.450761 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-95lkq" podStartSLOduration=9.07058357 podStartE2EDuration="1m8.450744649s" podCreationTimestamp="2026-02-17 14:29:29 +0000 UTC" firstStartedPulling="2026-02-17 14:29:35.893193428 +0000 UTC m=+1456.473194080" lastFinishedPulling="2026-02-17 14:30:35.273354507 +0000 UTC m=+1515.853355159" observedRunningTime="2026-02-17 14:30:37.431390803 +0000 UTC m=+1518.011391465" watchObservedRunningTime="2026-02-17 14:30:37.450744649 +0000 UTC m=+1518.030745301" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644412 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64547d6-018c-4123-9017-3f5ef64949b2-logs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-combined-ca-bundle\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-public-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644584 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-internal-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644690 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-config-data\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644732 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-scripts\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.644769 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnrc\" (UniqueName: \"kubernetes.io/projected/c64547d6-018c-4123-9017-3f5ef64949b2-kube-api-access-mmnrc\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.645828 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c64547d6-018c-4123-9017-3f5ef64949b2-logs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.675885 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-internal-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.676128 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-scripts\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.678731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnrc\" (UniqueName: \"kubernetes.io/projected/c64547d6-018c-4123-9017-3f5ef64949b2-kube-api-access-mmnrc\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.682896 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-public-tls-certs\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " 
pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.683390 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-combined-ca-bundle\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.683483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c64547d6-018c-4123-9017-3f5ef64949b2-config-data\") pod \"placement-74c5954b4-v4d8z\" (UID: \"c64547d6-018c-4123-9017-3f5ef64949b2\") " pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:37 crc kubenswrapper[4762]: I0217 14:30:37.923285 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:38 crc kubenswrapper[4762]: I0217 14:30:38.391154 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"] Feb 17 14:30:38 crc kubenswrapper[4762]: I0217 14:30:38.467538 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86657f9797-7sk9h"] Feb 17 14:30:39 crc kubenswrapper[4762]: I0217 14:30:39.370207 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-74c5954b4-v4d8z"] Feb 17 14:30:39 crc kubenswrapper[4762]: I0217 14:30:39.533904 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerStarted","Data":"cdab68fc6343a968244b7f29f859576c366cb98df02dc7e9dfd38fb1a11553de"} Feb 17 14:30:39 crc kubenswrapper[4762]: I0217 14:30:39.533951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerStarted","Data":"1e89929ca4a392de8b6214e0633686b4c6f8eab3965e4ef008dd4967670e1344"} Feb 17 14:30:39 crc kubenswrapper[4762]: I0217 14:30:39.537347 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-74c5954b4-v4d8z" event={"ID":"c64547d6-018c-4123-9017-3f5ef64949b2","Type":"ContainerStarted","Data":"cda3e523ec6ae686908bd882ced6337052a72ee4765e10f2f9996ecf0c73eeb0"} Feb 17 14:30:39 crc kubenswrapper[4762]: I0217 14:30:39.548069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86657f9797-7sk9h" event={"ID":"a23de52d-c70a-4f76-b067-cf4fef32b584","Type":"ContainerStarted","Data":"7278d89251cee6ee6d8a0cef7fc01f3f4e38b11e8db8a42e509e206b77a6670d"} Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.569296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-74c5954b4-v4d8z" event={"ID":"c64547d6-018c-4123-9017-3f5ef64949b2","Type":"ContainerStarted","Data":"8ce45d067d67164435435674dfedf2f58b3e6f435072d6722cbd996b12224ffe"} Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.574719 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86657f9797-7sk9h" event={"ID":"a23de52d-c70a-4f76-b067-cf4fef32b584","Type":"ContainerStarted","Data":"f41e9ba27a961d6fae471865071fc39af6c81786fe7116bf0a61acb0f5dd948b"} Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.575888 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-86657f9797-7sk9h" Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 
14:30:40.592368 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerStarted","Data":"6d98430e1f94464289bc63fa02da9dc080caacde8e8b1a23b7ac7a5be99b5372"} Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.593358 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.593608 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f47cdcfb-z94h7" Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.619555 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-86657f9797-7sk9h" podStartSLOduration=4.619529093 podStartE2EDuration="4.619529093s" podCreationTimestamp="2026-02-17 14:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:40.615048391 +0000 UTC m=+1521.195049043" watchObservedRunningTime="2026-02-17 14:30:40.619529093 +0000 UTC m=+1521.199529745" Feb 17 14:30:40 crc kubenswrapper[4762]: I0217 14:30:40.682390 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9f47cdcfb-z94h7" podStartSLOduration=4.68236479 podStartE2EDuration="4.68236479s" podCreationTimestamp="2026-02-17 14:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:40.660349332 +0000 UTC m=+1521.240349984" watchObservedRunningTime="2026-02-17 14:30:40.68236479 +0000 UTC m=+1521.262365442" Feb 17 14:30:41 crc kubenswrapper[4762]: I0217 14:30:41.607537 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-74c5954b4-v4d8z" event={"ID":"c64547d6-018c-4123-9017-3f5ef64949b2","Type":"ContainerStarted","Data":"55e71d55003af49ac78c3c8a5f51bed80ed1e785a6787cf49f6f615dc9b8de06"} Feb 17 14:30:41 crc kubenswrapper[4762]: I0217 14:30:41.607869 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:41 crc kubenswrapper[4762]: I0217 14:30:41.608435 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-74c5954b4-v4d8z" Feb 17 14:30:41 crc kubenswrapper[4762]: I0217 14:30:41.652992 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-74c5954b4-v4d8z" podStartSLOduration=5.652970428 podStartE2EDuration="5.652970428s" podCreationTimestamp="2026-02-17 14:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:41.627202288 +0000 UTC m=+1522.207202960" watchObservedRunningTime="2026-02-17 14:30:41.652970428 +0000 UTC m=+1522.232971070" Feb 17 14:30:42 crc kubenswrapper[4762]: I0217 14:30:42.212192 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:44 crc kubenswrapper[4762]: I0217 14:30:44.716188 4762 generic.go:334] "Generic (PLEG): container finished" podID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" containerID="3fb17ebbd8e146f643a15b507ad009691f75a0af1f916266e833930bfdc95b3a" exitCode=0 Feb 17 14:30:44 crc kubenswrapper[4762]: I0217 14:30:44.716791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-smktq" event={"ID":"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1","Type":"ContainerDied","Data":"3fb17ebbd8e146f643a15b507ad009691f75a0af1f916266e833930bfdc95b3a"} Feb 17 14:30:48 crc kubenswrapper[4762]: I0217 14:30:48.045433 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:30:48 crc kubenswrapper[4762]: I0217 14:30:48.046407 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:30:48 crc kubenswrapper[4762]: I0217 14:30:48.046540 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:30:49 crc kubenswrapper[4762]: I0217 14:30:49.074821 4762 generic.go:334] "Generic (PLEG): container finished" podID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" containerID="17aab810c353d27f1546f39fc1e9219e77f96483a29332f4c8a4803d99560833" exitCode=0 Feb 17 14:30:49 crc kubenswrapper[4762]: I0217 14:30:49.074902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h7qp8" event={"ID":"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3","Type":"ContainerDied","Data":"17aab810c353d27f1546f39fc1e9219e77f96483a29332f4c8a4803d99560833"} Feb 17 14:30:49 crc kubenswrapper[4762]: I0217 14:30:49.079790 4762 generic.go:334] "Generic (PLEG): container finished" podID="d6ea0210-709e-4a47-87d1-48c811c0ab85" containerID="c6759c99c71e5d3d5fe8cf99a1ee57341afec410927c40befc9081b3cbae7a1e" exitCode=0 Feb 17 14:30:49 crc kubenswrapper[4762]: I0217 14:30:49.079830 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-95lkq" event={"ID":"d6ea0210-709e-4a47-87d1-48c811c0ab85","Type":"ContainerDied","Data":"c6759c99c71e5d3d5fe8cf99a1ee57341afec410927c40befc9081b3cbae7a1e"} Feb 17 14:30:50 crc kubenswrapper[4762]: I0217 14:30:50.467098 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:30:50 crc kubenswrapper[4762]: I0217 14:30:50.580577 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-675485888-d9mtx"] Feb 17 14:30:50 crc kubenswrapper[4762]: I0217 14:30:50.583255 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-api" containerID="cri-o://7eb572168b3935d9726979198fa16470637b31e2930463f584f0deeb0929710b" gracePeriod=30 Feb 17 14:30:50 crc kubenswrapper[4762]: I0217 14:30:50.584556 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd" containerID="cri-o://8a1cf66aff096f324fbf95108d237110a5c977b8cb857a0ed48d96fbd625213d" gracePeriod=30 Feb 17 14:30:50 crc kubenswrapper[4762]: I0217 14:30:50.595948 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-675485888-d9mtx" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd" probeResult="failure" output="Get 
\"http://10.217.0.197:9696/\": EOF" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.004847 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-558c556c77-d2tbn"] Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.007499 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.024774 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-558c556c77-d2tbn"] Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.056904 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-internal-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.056965 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9rr\" (UniqueName: \"kubernetes.io/projected/af765db9-bd7e-4747-8269-49a27c5f0dc6-kube-api-access-tk9rr\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.057016 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-public-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.057044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-ovndb-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.057129 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-httpd-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.057206 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-combined-ca-bundle\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.057228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.109427 4762 generic.go:334] "Generic (PLEG): container finished" podID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" 
containerID="8a1cf66aff096f324fbf95108d237110a5c977b8cb857a0ed48d96fbd625213d" exitCode=0 Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.109481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerDied","Data":"8a1cf66aff096f324fbf95108d237110a5c977b8cb857a0ed48d96fbd625213d"} Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-httpd-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159524 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-combined-ca-bundle\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159556 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159768 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-internal-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159835 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9rr\" (UniqueName: \"kubernetes.io/projected/af765db9-bd7e-4747-8269-49a27c5f0dc6-kube-api-access-tk9rr\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-public-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.159950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-ovndb-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.167204 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-ovndb-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.167406 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-combined-ca-bundle\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.169618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-internal-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.171249 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.174770 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-public-tls-certs\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.175740 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af765db9-bd7e-4747-8269-49a27c5f0dc6-httpd-config\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.181345 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9rr\" (UniqueName: \"kubernetes.io/projected/af765db9-bd7e-4747-8269-49a27c5f0dc6-kube-api-access-tk9rr\") pod \"neutron-558c556c77-d2tbn\" (UID: \"af765db9-bd7e-4747-8269-49a27c5f0dc6\") " pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:51 crc kubenswrapper[4762]: I0217 14:30:51.356783 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.025264 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-95lkq" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.034168 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smktq" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.073664 4762 util.go:48] "No ready sandbox for pod can be found. 
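The volume lines for neutron-558c556c77-d2tbn follow the reconciler's usual progression, once per volume: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. At its core this is a desired-state versus actual-state diff; a toy version, with illustrative names rather than kubelet's types:

    package main

    import "fmt"

    // reconcile diffs the desired volume set against the actual one:
    // volumes that are desired but not yet present get mounted, volumes
    // that are present but no longer desired get unmounted (the pattern
    // seen later in this log when the db-sync pods are torn down).
    func reconcile(desired, actual map[string]bool) (mount, unmount []string) {
        for v := range desired {
            if !actual[v] {
                mount = append(mount, v)
            }
        }
        for v := range actual {
            if !desired[v] {
                unmount = append(unmount, v)
            }
        }
        return
    }

    func main() {
        desired := map[string]bool{"config": true, "httpd-config": true, "kube-api-access-tk9rr": true}
        actual := map[string]bool{"config": true}
        mount, unmount := reconcile(desired, actual)
        fmt.Println("mount:", mount, "unmount:", unmount)
    }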
Need to start a new one" pod="openstack/heat-db-sync-h7qp8" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131634 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131787 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131809 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrmjt\" (UniqueName: \"kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.131842 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data\") pod \"d6ea0210-709e-4a47-87d1-48c811c0ab85\" (UID: \"d6ea0210-709e-4a47-87d1-48c811c0ab85\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.133137 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-95lkq" event={"ID":"d6ea0210-709e-4a47-87d1-48c811c0ab85","Type":"ContainerDied","Data":"13d60409a852050d074383c44514d04956a2cf3fe81d23caad70f81fadf9f8f3"} Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.133170 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d60409a852050d074383c44514d04956a2cf3fe81d23caad70f81fadf9f8f3" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.133227 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-95lkq" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.135178 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.136478 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smktq" event={"ID":"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1","Type":"ContainerDied","Data":"969796ab12ea8175a5a692ef56eb31d465b47c897c75995370e429effdbfad68"} Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.136515 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969796ab12ea8175a5a692ef56eb31d465b47c897c75995370e429effdbfad68" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.136577 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smktq" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.138674 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts" (OuterVolumeSpecName: "scripts") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.140007 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt" (OuterVolumeSpecName: "kube-api-access-lrmjt") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "kube-api-access-lrmjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.141160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h7qp8" event={"ID":"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3","Type":"ContainerDied","Data":"f1cb6d2599641f1ecb30bbc8c92a196820b493f5dbba104ea486b3f88b03dc72"} Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.141191 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1cb6d2599641f1ecb30bbc8c92a196820b493f5dbba104ea486b3f88b03dc72" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.141258 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h7qp8" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.145256 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.167365 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.191336 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data" (OuterVolumeSpecName: "config-data") pod "d6ea0210-709e-4a47-87d1-48c811c0ab85" (UID: "d6ea0210-709e-4a47-87d1-48c811c0ab85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233225 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6n47\" (UniqueName: \"kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47\") pod \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233443 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle\") pod \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233485 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4t2\" (UniqueName: \"kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2\") pod \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233519 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data\") pod \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233548 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data\") pod \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\" (UID: \"a9c276b7-cca9-42c7-8605-5f2bfa0da0e1\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.233565 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle\") pod \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\" (UID: \"8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3\") " Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234183 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234200 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234209 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrmjt\" (UniqueName: \"kubernetes.io/projected/d6ea0210-709e-4a47-87d1-48c811c0ab85-kube-api-access-lrmjt\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234220 4762 reconciler_common.go:293] "Volume detached for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234228 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ea0210-709e-4a47-87d1-48c811c0ab85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.234236 4762 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6ea0210-709e-4a47-87d1-48c811c0ab85-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.241822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47" (OuterVolumeSpecName: "kube-api-access-t6n47") pod "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" (UID: "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1"). InnerVolumeSpecName "kube-api-access-t6n47". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.242271 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2" (OuterVolumeSpecName: "kube-api-access-wz4t2") pod "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" (UID: "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3"). InnerVolumeSpecName "kube-api-access-wz4t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.243344 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" (UID: "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.500750 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4t2\" (UniqueName: \"kubernetes.io/projected/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-kube-api-access-wz4t2\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.500790 4762 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.500800 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6n47\" (UniqueName: \"kubernetes.io/projected/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-kube-api-access-t6n47\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.507966 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" (UID: "a9c276b7-cca9-42c7-8605-5f2bfa0da0e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.517391 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" (UID: "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.599867 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data" (OuterVolumeSpecName: "config-data") pod "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" (UID: "8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.602588 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.602920 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:52 crc kubenswrapper[4762]: I0217 14:30:52.603291 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.503707 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-67d8dd69f-j2ffh"] Feb 17 14:30:53 crc kubenswrapper[4762]: E0217 14:30:53.504785 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" containerName="cinder-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.504801 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" containerName="cinder-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: E0217 14:30:53.504834 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" containerName="barbican-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.504840 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" containerName="barbican-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: E0217 14:30:53.504854 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" containerName="heat-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.504864 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" containerName="heat-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.505107 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" containerName="heat-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.505130 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" containerName="barbican-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.505142 4762 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" containerName="cinder-db-sync" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.506446 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.513248 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.513482 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-clgpv" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.514060 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.553159 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-665f7bf56b-7d7wz"] Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.555762 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.569555 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.902690 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a887bb10-111b-4b5e-b2fc-c204129ff11c-logs\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.902850 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88jf\" (UniqueName: \"kubernetes.io/projected/a887bb10-111b-4b5e-b2fc-c204129ff11c-kube-api-access-l88jf\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.902932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a51610-1744-455d-beff-2204a3452e61-logs\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.902990 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-combined-ca-bundle\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.903091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.903214 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-combined-ca-bundle\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.917241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kfb\" (UniqueName: \"kubernetes.io/projected/f6a51610-1744-455d-beff-2204a3452e61-kube-api-access-b7kfb\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.917460 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data-custom\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.917539 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.917570 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data-custom\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:53 crc kubenswrapper[4762]: I0217 14:30:53.945843 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67d8dd69f-j2ffh"] Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037484 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data-custom\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037551 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data-custom\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037665 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a887bb10-111b-4b5e-b2fc-c204129ff11c-logs\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037707 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l88jf\" (UniqueName: \"kubernetes.io/projected/a887bb10-111b-4b5e-b2fc-c204129ff11c-kube-api-access-l88jf\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037743 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a51610-1744-455d-beff-2204a3452e61-logs\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037761 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-combined-ca-bundle\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-combined-ca-bundle\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.037888 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kfb\" (UniqueName: \"kubernetes.io/projected/f6a51610-1744-455d-beff-2204a3452e61-kube-api-access-b7kfb\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.053891 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a887bb10-111b-4b5e-b2fc-c204129ff11c-logs\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.054330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a51610-1744-455d-beff-2204a3452e61-logs\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 
14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.066781 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-combined-ca-bundle\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.072197 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.074098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-combined-ca-bundle\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.100121 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.108438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a887bb10-111b-4b5e-b2fc-c204129ff11c-config-data-custom\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.117618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88jf\" (UniqueName: \"kubernetes.io/projected/a887bb10-111b-4b5e-b2fc-c204129ff11c-kube-api-access-l88jf\") pod \"barbican-worker-67d8dd69f-j2ffh\" (UID: \"a887bb10-111b-4b5e-b2fc-c204129ff11c\") " pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.125065 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kfb\" (UniqueName: \"kubernetes.io/projected/f6a51610-1744-455d-beff-2204a3452e61-kube-api-access-b7kfb\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.129778 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a51610-1744-455d-beff-2204a3452e61-config-data-custom\") pod \"barbican-keystone-listener-665f7bf56b-7d7wz\" (UID: \"f6a51610-1744-455d-beff-2204a3452e61\") " pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.162170 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-665f7bf56b-7d7wz"] Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.173040 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67d8dd69f-j2ffh" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.199306 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" Feb 17 14:30:54 crc kubenswrapper[4762]: E0217 14:30:54.626026 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.626464 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.626493 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.786613 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerStarted","Data":"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b"} Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.787203 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="ceilometer-notification-agent" containerID="cri-o://0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430" gracePeriod=30 Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.787556 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.788041 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="proxy-httpd" containerID="cri-o://fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b" gracePeriod=30 Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.788115 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="sg-core" containerID="cri-o://e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3" gracePeriod=30 Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.900930 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.903254 4762 util.go:30] "No sandbox for pod can be found. 
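"Killing container with a grace period" with gracePeriod=30, as for the ceilometer-0 containers above, means the runtime is asked to stop the container and given 30 seconds before a forced kill. A toy version of that wait-then-force pattern; stopFn and killFn stand in for the CRI calls and are not kubelet's API:

    package main

    import (
        "fmt"
        "time"
    )

    // killWithGrace requests a stop, waits up to gracePeriod for the
    // container to exit, then force-kills it.
    func killWithGrace(name string, gracePeriod time.Duration, stopFn func() <-chan struct{}, killFn func()) {
        done := stopFn()
        select {
        case <-done:
            fmt.Printf("%s exited within grace period\n", name)
        case <-time.After(gracePeriod):
            killFn()
            fmt.Printf("%s force-killed after %s\n", name, gracePeriod)
        }
    }

    func main() {
        stop := func() <-chan struct{} {
            ch := make(chan struct{})
            go func() { time.Sleep(10 * time.Millisecond); close(ch) }()
            return ch
        }
        killWithGrace("ceilometer-notification-agent", 30*time.Second, stop, func() {})
    }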
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.922363 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:30:54 crc kubenswrapper[4762]: I0217 14:30:54.946758 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.253530 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.271365 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.308930 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.340525 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.376393 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.376493 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.376598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.376890 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.376956 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.377029 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc 
kubenswrapper[4762]: I0217 14:30:55.377229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.377314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.377344 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjxt\" (UniqueName: \"kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.377404 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tqjg\" (UniqueName: \"kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.377481 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.403607 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.416418 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.416686 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.416934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hcfzc" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.447113 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486419 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486470 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486516 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486546 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486606 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486648 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486729 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dfjxt\" (UniqueName: \"kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486750 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486770 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tqjg\" (UniqueName: \"kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486850 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486935 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.486950 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.491907 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.500240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.500926 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.501727 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.501728 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.505376 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.527594 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.562014 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.578152 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tqjg\" (UniqueName: \"kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg\") pod \"dnsmasq-dns-848cf88cfc-5lnvj\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589155 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589360 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589498 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.589569 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.595871 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.626174 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjxt\" (UniqueName: \"kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.631326 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.631694 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " 
pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.632809 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.633421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.643578 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.644918 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.906028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.919957 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk\") pod \"cinder-scheduler-0\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " pod="openstack/cinder-scheduler-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.940226 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle\") pod \"barbican-api-756fc9c9d4-786zt\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.940412 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.943177 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.956533 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 14:30:55 crc kubenswrapper[4762]: I0217 14:30:55.996354 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.019438 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049199 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049253 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszhr\" (UniqueName: \"kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.049493 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.030336 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558c556c77-d2tbn" event={"ID":"af765db9-bd7e-4747-8269-49a27c5f0dc6","Type":"ContainerStarted","Data":"5c3a0e45aeec478b308decafa1d9a310b026436c86e653ad118255d462c45287"} Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.068505 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.071589 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.094709 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerID="e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3" exitCode=2 Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.099088 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152099 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152235 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cwdb\" (UniqueName: \"kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152398 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152430 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152452 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszhr\" (UniqueName: \"kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152578 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152620 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152755 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.152806 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.155499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.160185 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.485055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerDied","Data":"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3"} Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.485143 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.486492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.486582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.486725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.486822 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.486966 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.487212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cwdb\" (UniqueName: \"kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.487423 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.487900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.488313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.488485 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.488606 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: 
\"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.493965 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.498900 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.513713 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.523855 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-558c556c77-d2tbn"] Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.528549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszhr\" (UniqueName: \"kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.535667 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.536371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom\") pod \"cinder-api-0\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") " pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.557540 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cwdb\" (UniqueName: \"kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb\") pod \"dnsmasq-dns-6578955fd5-zmxjz\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:56 crc kubenswrapper[4762]: I0217 14:30:56.669275 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:30:56 crc kubenswrapper[4762]: W0217 14:30:56.725183 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a51610_1744_455d_beff_2204a3452e61.slice/crio-5377c42c8197b5680104f421440401f1de612dbe27fecc79a7977f6b756c41da WatchSource:0}: Error finding container 5377c42c8197b5680104f421440401f1de612dbe27fecc79a7977f6b756c41da: Status 404 returned error can't find the container with id 5377c42c8197b5680104f421440401f1de612dbe27fecc79a7977f6b756c41da Feb 17 14:30:57 crc kubenswrapper[4762]: I0217 14:30:57.090170 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:30:57 crc kubenswrapper[4762]: I0217 14:30:57.117314 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="3fe6d960-8cae-47d2-86e7-c077f0facaae" containerName="galera" probeResult="failure" output="command timed out" Feb 17 14:30:57 crc kubenswrapper[4762]: I0217 14:30:57.168885 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" event={"ID":"f6a51610-1744-455d-beff-2204a3452e61","Type":"ContainerStarted","Data":"5377c42c8197b5680104f421440401f1de612dbe27fecc79a7977f6b756c41da"} Feb 17 14:30:57 crc kubenswrapper[4762]: I0217 14:30:57.267909 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-665f7bf56b-7d7wz"] Feb 17 14:30:57 crc kubenswrapper[4762]: I0217 14:30:57.622251 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67d8dd69f-j2ffh"] Feb 17 14:30:58 crc kubenswrapper[4762]: I0217 14:30:58.455383 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558c556c77-d2tbn" event={"ID":"af765db9-bd7e-4747-8269-49a27c5f0dc6","Type":"ContainerStarted","Data":"134340dc14856665ad80ed396e887adacfab8dd7347784d0870627c957187c71"} Feb 17 14:30:58 crc kubenswrapper[4762]: I0217 14:30:58.455743 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67d8dd69f-j2ffh" event={"ID":"a887bb10-111b-4b5e-b2fc-c204129ff11c","Type":"ContainerStarted","Data":"50cb8894c7421dead902542d7af8581e34ef75ce2a7dc1777e3650963790c89c"} Feb 17 14:30:58 crc kubenswrapper[4762]: I0217 14:30:58.591767 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:30:58 crc kubenswrapper[4762]: I0217 14:30:58.609154 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:30:59 crc kubenswrapper[4762]: I0217 14:30:59.732382 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerStarted","Data":"0b5e643c3d05469b963433da6f2279c22b43d1c00a9880905791b06503aa0011"} Feb 17 14:30:59 crc kubenswrapper[4762]: I0217 14:30:59.736636 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" event={"ID":"8c6f4988-c24b-4424-b07a-bd066359ed2b","Type":"ContainerStarted","Data":"b8ea5ce5178ce9a4be2805829c0b74154adf19c393f256953a58d735461cf0ef"} Feb 17 14:30:59 crc kubenswrapper[4762]: I0217 14:30:59.758555 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.006385 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.136574 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:31:00 crc kubenswrapper[4762]: W0217 14:31:00.166993 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65bff6fa_f7aa_4b40_ae05_169a575e6096.slice/crio-2db46896d334f0e74452a92b99c92527d0e4cc01e446e52a5f7078fda797892b WatchSource:0}: Error finding container 2db46896d334f0e74452a92b99c92527d0e4cc01e446e52a5f7078fda797892b: Status 404 returned error can't find the container with id 
2db46896d334f0e74452a92b99c92527d0e4cc01e446e52a5f7078fda797892b Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.881282 4762 generic.go:334] "Generic (PLEG): container finished" podID="8c6f4988-c24b-4424-b07a-bd066359ed2b" containerID="629338c72e7b49b6525cebb635e6aad326ce5f9d7979708a9bcb4831aba90f42" exitCode=0 Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.881505 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" event={"ID":"8c6f4988-c24b-4424-b07a-bd066359ed2b","Type":"ContainerDied","Data":"629338c72e7b49b6525cebb635e6aad326ce5f9d7979708a9bcb4831aba90f42"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.889429 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerStarted","Data":"251eccff5c753e67e6e55d07601deda64a575a274199020e4970e7938059ff31"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.911118 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerID="0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430" exitCode=0 Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.911237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerDied","Data":"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.929877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" event={"ID":"65bff6fa-f7aa-4b40-ae05-169a575e6096","Type":"ContainerStarted","Data":"2db46896d334f0e74452a92b99c92527d0e4cc01e446e52a5f7078fda797892b"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.937402 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerStarted","Data":"1e8f7576bdb5614a2334ed2eebedc86a7b4e37e374216554c3dd86a1e47a07aa"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.949963 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558c556c77-d2tbn" event={"ID":"af765db9-bd7e-4747-8269-49a27c5f0dc6","Type":"ContainerStarted","Data":"6fb8ac46a62043788c2050ae9296880b44eeb9ba2f6c4bfa9eca35fcb516d624"} Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.950285 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:31:00 crc kubenswrapper[4762]: I0217 14:31:00.981476 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-558c556c77-d2tbn" podStartSLOduration=10.98143123 podStartE2EDuration="10.98143123s" podCreationTimestamp="2026-02-17 14:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:00.976944398 +0000 UTC m=+1541.556945060" watchObservedRunningTime="2026-02-17 14:31:00.98143123 +0000 UTC m=+1541.561431892" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.315952 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.323241 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381185 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381389 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381429 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381530 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381569 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tqjg\" (UniqueName: \"kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.381750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb\") pod \"8c6f4988-c24b-4424-b07a-bd066359ed2b\" (UID: \"8c6f4988-c24b-4424-b07a-bd066359ed2b\") " Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.432862 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg" (OuterVolumeSpecName: "kube-api-access-8tqjg") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "kube-api-access-8tqjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.485043 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tqjg\" (UniqueName: \"kubernetes.io/projected/8c6f4988-c24b-4424-b07a-bd066359ed2b-kube-api-access-8tqjg\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.536878 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.551513 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config" (OuterVolumeSpecName: "config") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.581253 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.591310 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.591827 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.591900 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.602541 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.604278 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c6f4988-c24b-4424-b07a-bd066359ed2b" (UID: "8c6f4988-c24b-4424-b07a-bd066359ed2b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.694472 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.694588 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6f4988-c24b-4424-b07a-bd066359ed2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.989280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerStarted","Data":"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d"} Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.989590 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerStarted","Data":"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43"} Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.989680 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:31:01 crc kubenswrapper[4762]: I0217 14:31:01.989709 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.255183 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.265263 4762 generic.go:334] "Generic (PLEG): container finished" podID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerID="80f2662feae74d8b54a324a35f9f3dee6b653f1f6a0420e7070729dac06143a7" exitCode=0 Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.269240 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-5lnvj" event={"ID":"8c6f4988-c24b-4424-b07a-bd066359ed2b","Type":"ContainerDied","Data":"b8ea5ce5178ce9a4be2805829c0b74154adf19c393f256953a58d735461cf0ef"} Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.273699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" event={"ID":"65bff6fa-f7aa-4b40-ae05-169a575e6096","Type":"ContainerDied","Data":"80f2662feae74d8b54a324a35f9f3dee6b653f1f6a0420e7070729dac06143a7"} Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.273791 4762 scope.go:117] "RemoveContainer" containerID="629338c72e7b49b6525cebb635e6aad326ce5f9d7979708a9bcb4831aba90f42" Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.311305 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-756fc9c9d4-786zt" podStartSLOduration=8.311281768 podStartE2EDuration="8.311281768s" podCreationTimestamp="2026-02-17 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:02.256082598 +0000 UTC m=+1542.836083250" watchObservedRunningTime="2026-02-17 14:31:02.311281768 +0000 UTC m=+1542.891282410" Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.397690 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:31:02 crc kubenswrapper[4762]: I0217 14:31:02.431409 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-5lnvj"] Feb 17 14:31:03 crc kubenswrapper[4762]: I0217 14:31:03.560384 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerStarted","Data":"0e90131a756794f43460e008fa6b22fcbcdaf1612ceab184bd0858cb7e334981"} Feb 17 14:31:04 crc kubenswrapper[4762]: I0217 14:31:04.098900 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6f4988-c24b-4424-b07a-bd066359ed2b" path="/var/lib/kubelet/pods/8c6f4988-c24b-4424-b07a-bd066359ed2b/volumes" Feb 17 14:31:04 crc kubenswrapper[4762]: I0217 14:31:04.673796 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerStarted","Data":"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc"} Feb 17 14:31:04 crc kubenswrapper[4762]: I0217 14:31:04.828396 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:31:04 crc kubenswrapper[4762]: I0217 14:31:04.854021 4762 generic.go:334] "Generic (PLEG): container finished" podID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerID="7eb572168b3935d9726979198fa16470637b31e2930463f584f0deeb0929710b" exitCode=0 Feb 17 14:31:04 crc kubenswrapper[4762]: I0217 14:31:04.854079 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerDied","Data":"7eb572168b3935d9726979198fa16470637b31e2930463f584f0deeb0929710b"} Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.346576 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f7475d794-g4jpc"] Feb 17 14:31:05 crc kubenswrapper[4762]: E0217 14:31:05.348873 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6f4988-c24b-4424-b07a-bd066359ed2b" containerName="init" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.348908 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6f4988-c24b-4424-b07a-bd066359ed2b" containerName="init" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.349276 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6f4988-c24b-4424-b07a-bd066359ed2b" containerName="init" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.351171 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.355618 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.355712 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.364888 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f7475d794-g4jpc"] Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.440606 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8wh\" (UniqueName: \"kubernetes.io/projected/dafb15f9-f633-4acc-a69f-6199b20ae0e7-kube-api-access-jq8wh\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.440730 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dafb15f9-f633-4acc-a69f-6199b20ae0e7-logs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.440770 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-combined-ca-bundle\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.440789 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-internal-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.440929 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.441001 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data-custom\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.441116 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-public-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543552 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-public-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8wh\" (UniqueName: \"kubernetes.io/projected/dafb15f9-f633-4acc-a69f-6199b20ae0e7-kube-api-access-jq8wh\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543765 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dafb15f9-f633-4acc-a69f-6199b20ae0e7-logs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-combined-ca-bundle\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543825 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-internal-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.543968 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data-custom\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.544616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dafb15f9-f633-4acc-a69f-6199b20ae0e7-logs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.548607 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-internal-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.558043 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data-custom\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.548371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-public-tls-certs\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.559459 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-combined-ca-bundle\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.571182 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8wh\" (UniqueName: \"kubernetes.io/projected/dafb15f9-f633-4acc-a69f-6199b20ae0e7-kube-api-access-jq8wh\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.588033 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafb15f9-f633-4acc-a69f-6199b20ae0e7-config-data\") pod \"barbican-api-5f7475d794-g4jpc\" (UID: \"dafb15f9-f633-4acc-a69f-6199b20ae0e7\") " pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:05 crc kubenswrapper[4762]: I0217 14:31:05.672174 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.335034 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-675485888-d9mtx"
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.438583 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config\") pod \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") "
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.438772 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs\") pod \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") "
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.438912 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwsd7\" (UniqueName: \"kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7\") pod \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") "
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.439211 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle\") pod \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") "
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.439250 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config\") pod \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\" (UID: \"ee2eb703-bf85-475a-8fea-fca5c7930dd1\") "
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.465014 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ee2eb703-bf85-475a-8fea-fca5c7930dd1" (UID: "ee2eb703-bf85-475a-8fea-fca5c7930dd1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.470810 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7" (OuterVolumeSpecName: "kube-api-access-pwsd7") pod "ee2eb703-bf85-475a-8fea-fca5c7930dd1" (UID: "ee2eb703-bf85-475a-8fea-fca5c7930dd1"). InnerVolumeSpecName "kube-api-access-pwsd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.533063 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config" (OuterVolumeSpecName: "config") pod "ee2eb703-bf85-475a-8fea-fca5c7930dd1" (UID: "ee2eb703-bf85-475a-8fea-fca5c7930dd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.556333 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwsd7\" (UniqueName: \"kubernetes.io/projected/ee2eb703-bf85-475a-8fea-fca5c7930dd1-kube-api-access-pwsd7\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.556370 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.556380 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.577964 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ee2eb703-bf85-475a-8fea-fca5c7930dd1" (UID: "ee2eb703-bf85-475a-8fea-fca5c7930dd1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.617619 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee2eb703-bf85-475a-8fea-fca5c7930dd1" (UID: "ee2eb703-bf85-475a-8fea-fca5c7930dd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.619141 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f7475d794-g4jpc"]
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.658409 4762 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.658447 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2eb703-bf85-475a-8fea-fca5c7930dd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:06 crc kubenswrapper[4762]: W0217 14:31:06.689516 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddafb15f9_f633_4acc_a69f_6199b20ae0e7.slice/crio-82e81809240c01643a9f5e09f1aa21b46f8e0438e6ebe74027111762a93c54e9 WatchSource:0}: Error finding container 82e81809240c01643a9f5e09f1aa21b46f8e0438e6ebe74027111762a93c54e9: Status 404 returned error can't find the container with id 82e81809240c01643a9f5e09f1aa21b46f8e0438e6ebe74027111762a93c54e9
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.911275 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" event={"ID":"f6a51610-1744-455d-beff-2204a3452e61","Type":"ContainerStarted","Data":"a0eef836dad2d83e4310a8bfbbfdf2774a41fbde7012f9a2e266f92677904654"}
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.914055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675485888-d9mtx" event={"ID":"ee2eb703-bf85-475a-8fea-fca5c7930dd1","Type":"ContainerDied","Data":"e016c781ba3daa1a33a35740fe8ca67eeaee1607cd632100618f4f5dff090392"}
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.914103 4762 scope.go:117] "RemoveContainer" containerID="8a1cf66aff096f324fbf95108d237110a5c977b8cb857a0ed48d96fbd625213d"
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.914142 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675485888-d9mtx"
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.917487 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f7475d794-g4jpc" event={"ID":"dafb15f9-f633-4acc-a69f-6199b20ae0e7","Type":"ContainerStarted","Data":"82e81809240c01643a9f5e09f1aa21b46f8e0438e6ebe74027111762a93c54e9"}
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.921896 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67d8dd69f-j2ffh" event={"ID":"a887bb10-111b-4b5e-b2fc-c204129ff11c","Type":"ContainerStarted","Data":"5698f56980157f3e6f223566229ed641105b1d4a3b4903c211cbfded2f50934f"}
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.934636 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" event={"ID":"65bff6fa-f7aa-4b40-ae05-169a575e6096","Type":"ContainerStarted","Data":"93c59150e6f56455566c0992cf1e3e192dfdc61550db8c1d7bbc64ab523ef0db"}
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.935204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz"
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.958244 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-675485888-d9mtx"]
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.974760 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-675485888-d9mtx"]
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.988166 4762 scope.go:117] "RemoveContainer" containerID="7eb572168b3935d9726979198fa16470637b31e2930463f584f0deeb0929710b"
Feb 17 14:31:06 crc kubenswrapper[4762]: I0217 14:31:06.998537 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" podStartSLOduration=12.998514533 podStartE2EDuration="12.998514533s" podCreationTimestamp="2026-02-17 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:06.978355975 +0000 UTC m=+1547.558356627" watchObservedRunningTime="2026-02-17 14:31:06.998514533 +0000 UTC m=+1547.578515185"
Feb 17 14:31:07 crc kubenswrapper[4762]: I0217 14:31:07.973715 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerStarted","Data":"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7"}
Feb 17 14:31:07 crc kubenswrapper[4762]: I0217 14:31:07.999751 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f7475d794-g4jpc" event={"ID":"dafb15f9-f633-4acc-a69f-6199b20ae0e7","Type":"ContainerStarted","Data":"bba6c22040c2e6d5aefc4372d37c383da6c41c9803c944e7aa3d8e58c6b8ddae"}
Feb 17 14:31:07 crc kubenswrapper[4762]: I0217 14:31:07.999809 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f7475d794-g4jpc" event={"ID":"dafb15f9-f633-4acc-a69f-6199b20ae0e7","Type":"ContainerStarted","Data":"0bf37e45e68cfcf2c49de24318a5dd86b9e943d5bd6ea195af6986957e29a1f7"}
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.000196 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f7475d794-g4jpc"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.000234 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f7475d794-g4jpc"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.001289 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.559230239 podStartE2EDuration="14.001272303s" podCreationTimestamp="2026-02-17 14:30:54 +0000 UTC" firstStartedPulling="2026-02-17 14:30:59.775825519 +0000 UTC m=+1540.355826171" lastFinishedPulling="2026-02-17 14:31:01.217867583 +0000 UTC m=+1541.797868235" observedRunningTime="2026-02-17 14:31:07.998336803 +0000 UTC m=+1548.578337455" watchObservedRunningTime="2026-02-17 14:31:08.001272303 +0000 UTC m=+1548.581272955"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.007618 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67d8dd69f-j2ffh" event={"ID":"a887bb10-111b-4b5e-b2fc-c204129ff11c","Type":"ContainerStarted","Data":"f0a9442065c96387ad90b2e509f45cd70b632ae448c27ef860e8ca8ea031708c"}
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.019676 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerStarted","Data":"e27ba2cde044c5472c2a52457ac666f92df587ba3ff15ec4a5891ed6194d7446"}
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.019879 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api-log" containerID="cri-o://0e90131a756794f43460e008fa6b22fcbcdaf1612ceab184bd0858cb7e334981" gracePeriod=30
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.019971 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.020008 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api" containerID="cri-o://e27ba2cde044c5472c2a52457ac666f92df587ba3ff15ec4a5891ed6194d7446" gracePeriod=30
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.031734 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f7475d794-g4jpc" podStartSLOduration=3.03170917 podStartE2EDuration="3.03170917s" podCreationTimestamp="2026-02-17 14:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:08.0287612 +0000 UTC m=+1548.608761872" watchObservedRunningTime="2026-02-17 14:31:08.03170917 +0000 UTC m=+1548.611709832"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.051210 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" event={"ID":"f6a51610-1744-455d-beff-2204a3452e61","Type":"ContainerStarted","Data":"9d2380908f1114cae7de1bff4a3ae270cc8e1912f20019df7f18b11ac5200e82"}
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.090809 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" path="/var/lib/kubelet/pods/ee2eb703-bf85-475a-8fea-fca5c7930dd1/volumes"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.109785 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=14.10975877 podStartE2EDuration="14.10975877s" podCreationTimestamp="2026-02-17 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:08.059016732 +0000 UTC m=+1548.639017414" watchObservedRunningTime="2026-02-17 14:31:08.10975877 +0000 UTC m=+1548.689759412"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.152068 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-67d8dd69f-j2ffh" podStartSLOduration=7.116254966 podStartE2EDuration="15.152042209s" podCreationTimestamp="2026-02-17 14:30:53 +0000 UTC" firstStartedPulling="2026-02-17 14:30:57.744203247 +0000 UTC m=+1538.324203899" lastFinishedPulling="2026-02-17 14:31:05.77999049 +0000 UTC m=+1546.359991142" observedRunningTime="2026-02-17 14:31:08.083157958 +0000 UTC m=+1548.663158610" watchObservedRunningTime="2026-02-17 14:31:08.152042209 +0000 UTC m=+1548.732042861"
Feb 17 14:31:08 crc kubenswrapper[4762]: I0217 14:31:08.188244 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-665f7bf56b-7d7wz" podStartSLOduration=6.680994892 podStartE2EDuration="15.188217862s" podCreationTimestamp="2026-02-17 14:30:53 +0000 UTC" firstStartedPulling="2026-02-17 14:30:57.199007676 +0000 UTC m=+1537.779008328" lastFinishedPulling="2026-02-17 14:31:05.706230646 +0000 UTC m=+1546.286231298" observedRunningTime="2026-02-17 14:31:08.111118667 +0000 UTC m=+1548.691119319" watchObservedRunningTime="2026-02-17 14:31:08.188217862 +0000 UTC m=+1548.768218514"
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.079542 4762 generic.go:334] "Generic (PLEG): container finished" podID="62789108-d496-46e9-a85d-d00e3c4cb407" containerID="e27ba2cde044c5472c2a52457ac666f92df587ba3ff15ec4a5891ed6194d7446" exitCode=0
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.079896 4762 generic.go:334] "Generic (PLEG): container finished" podID="62789108-d496-46e9-a85d-d00e3c4cb407" containerID="0e90131a756794f43460e008fa6b22fcbcdaf1612ceab184bd0858cb7e334981" exitCode=143
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.079761 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerDied","Data":"e27ba2cde044c5472c2a52457ac666f92df587ba3ff15ec4a5891ed6194d7446"}
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.081024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerDied","Data":"0e90131a756794f43460e008fa6b22fcbcdaf1612ceab184bd0858cb7e334981"}
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.518661 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.639666 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.639918 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640036 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640063 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640102 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tszhr\" (UniqueName: \"kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640151 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640175 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts\") pod \"62789108-d496-46e9-a85d-d00e3c4cb407\" (UID: \"62789108-d496-46e9-a85d-d00e3c4cb407\") "
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.640885 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs" (OuterVolumeSpecName: "logs") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.642741 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62789108-d496-46e9-a85d-d00e3c4cb407-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.642770 4762 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62789108-d496-46e9-a85d-d00e3c4cb407-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.648953 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts" (OuterVolumeSpecName: "scripts") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.665865 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.667847 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr" (OuterVolumeSpecName: "kube-api-access-tszhr") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "kube-api-access-tszhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.746246 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tszhr\" (UniqueName: \"kubernetes.io/projected/62789108-d496-46e9-a85d-d00e3c4cb407-kube-api-access-tszhr\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.746280 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.746289 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.804054 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data" (OuterVolumeSpecName: "config-data") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.852327 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.855816 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62789108-d496-46e9-a85d-d00e3c4cb407" (UID: "62789108-d496-46e9-a85d-d00e3c4cb407"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.946045 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f47cdcfb-z94h7"
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.951140 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f47cdcfb-z94h7"
Feb 17 14:31:09 crc kubenswrapper[4762]: I0217 14:31:09.954304 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62789108-d496-46e9-a85d-d00e3c4cb407-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.123756 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.124216 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"62789108-d496-46e9-a85d-d00e3c4cb407","Type":"ContainerDied","Data":"251eccff5c753e67e6e55d07601deda64a575a274199020e4970e7938059ff31"}
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.124468 4762 scope.go:117] "RemoveContainer" containerID="e27ba2cde044c5472c2a52457ac666f92df587ba3ff15ec4a5891ed6194d7446"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.173942 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.193466 4762 scope.go:117] "RemoveContainer" containerID="0e90131a756794f43460e008fa6b22fcbcdaf1612ceab184bd0858cb7e334981"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.204879 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.235702 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:31:10 crc kubenswrapper[4762]: E0217 14:31:10.236411 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api-log"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236431 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api-log"
Feb 17 14:31:10 crc kubenswrapper[4762]: E0217 14:31:10.236443 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236452 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: E0217 14:31:10.236490 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236498 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd"
Feb 17 14:31:10 crc kubenswrapper[4762]: E0217 14:31:10.236529 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236537 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236836 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-httpd"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236872 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236884 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2eb703-bf85-475a-8fea-fca5c7930dd1" containerName="neutron-api"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.236896 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" containerName="cinder-api-log"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.239084 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.246132 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.246193 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.246398 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.265208 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.271098 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-756fc9c9d4-786zt"
Feb 17 14:31:10 crc kubenswrapper[4762]: E0217 14:31:10.288170 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62789108_d496_46e9_a85d_d00e3c4cb407.slice/crio-251eccff5c753e67e6e55d07601deda64a575a274199020e4970e7938059ff31\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62789108_d496_46e9_a85d_d00e3c4cb407.slice\": RecentStats: unable to find data in memory cache]"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362192 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e58addf-d172-4f09-b4e5-30b62cafb801-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362252 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e58addf-d172-4f09-b4e5-30b62cafb801-logs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362291 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362311 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-scripts\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362362 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362431 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5xt5\" (UniqueName: \"kubernetes.io/projected/1e58addf-d172-4f09-b4e5-30b62cafb801-kube-api-access-k5xt5\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.362600 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data-custom\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.396363 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-74c5954b4-v4d8z"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.472336 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.472431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-scripts\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.472883 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.473027 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.473136 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.473725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5xt5\" (UniqueName: \"kubernetes.io/projected/1e58addf-d172-4f09-b4e5-30b62cafb801-kube-api-access-k5xt5\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.473806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data-custom\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.473969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e58addf-d172-4f09-b4e5-30b62cafb801-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.474012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e58addf-d172-4f09-b4e5-30b62cafb801-logs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.474578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e58addf-d172-4f09-b4e5-30b62cafb801-logs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.474926 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e58addf-d172-4f09-b4e5-30b62cafb801-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.493293 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.493436 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.494244 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-scripts\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.495425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.498368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5xt5\" (UniqueName: \"kubernetes.io/projected/1e58addf-d172-4f09-b4e5-30b62cafb801-kube-api-access-k5xt5\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.506374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-config-data-custom\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.512343 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e58addf-d172-4f09-b4e5-30b62cafb801-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1e58addf-d172-4f09-b4e5-30b62cafb801\") " pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.526385 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-74c5954b4-v4d8z"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.572632 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.676337 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"]
Feb 17 14:31:10 crc kubenswrapper[4762]: I0217 14:31:10.900127 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-86657f9797-7sk9h"
Feb 17 14:31:11 crc kubenswrapper[4762]: I0217 14:31:11.100430 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 17 14:31:11 crc kubenswrapper[4762]: I0217 14:31:11.108094 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.207:8080/\": dial tcp 10.217.0.207:8080: connect: connection refused"
Feb 17 14:31:11 crc kubenswrapper[4762]: I0217 14:31:11.141918 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-9f47cdcfb-z94h7" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-log" containerID="cri-o://cdab68fc6343a968244b7f29f859576c366cb98df02dc7e9dfd38fb1a11553de" gracePeriod=30
Feb 17 14:31:11 crc kubenswrapper[4762]: I0217 14:31:11.142440 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-9f47cdcfb-z94h7" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-api" containerID="cri-o://6d98430e1f94464289bc63fa02da9dc080caacde8e8b1a23b7ac7a5be99b5372" gracePeriod=30
Feb 17 14:31:11 crc kubenswrapper[4762]: I0217 14:31:11.333968 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.104336 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62789108-d496-46e9-a85d-d00e3c4cb407" path="/var/lib/kubelet/pods/62789108-d496-46e9-a85d-d00e3c4cb407/volumes"
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.107522 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz"
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.274634 4762 generic.go:334] "Generic (PLEG): container finished" podID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerID="cdab68fc6343a968244b7f29f859576c366cb98df02dc7e9dfd38fb1a11553de" exitCode=143
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.274709 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerDied","Data":"cdab68fc6343a968244b7f29f859576c366cb98df02dc7e9dfd38fb1a11553de"}
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.276065 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1e58addf-d172-4f09-b4e5-30b62cafb801","Type":"ContainerStarted","Data":"19bcf42b8f0a29b4299d24f4288eb470b4da971b063812397cd26eceeea65c32"}
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.276098 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1e58addf-d172-4f09-b4e5-30b62cafb801","Type":"ContainerStarted","Data":"45026e7348e73a319805b0e215cee3087135219868c159d0f4617968b743b275"}
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.281544 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"]
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.281811 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="dnsmasq-dns" containerID="cri-o://01c3bfedbbdda822752c16fbf30ea475f2a4e991d8289023001b4761f36dc674" gracePeriod=10
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.737683 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.196:5353: connect: connection refused"
Feb 17 14:31:12 crc kubenswrapper[4762]: I0217 14:31:12.978396 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-756fc9c9d4-786zt"
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.355676 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" event={"ID":"8befecb9-4510-4921-a212-e80a8b832855","Type":"ContainerDied","Data":"01c3bfedbbdda822752c16fbf30ea475f2a4e991d8289023001b4761f36dc674"}
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.372371 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw"
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.348635 4762 generic.go:334] "Generic (PLEG): container finished" podID="8befecb9-4510-4921-a212-e80a8b832855" containerID="01c3bfedbbdda822752c16fbf30ea475f2a4e991d8289023001b4761f36dc674" exitCode=0
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.376803 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw" event={"ID":"8befecb9-4510-4921-a212-e80a8b832855","Type":"ContainerDied","Data":"e63c95946f220211e49d9be2e6985955101adc0cd48c0a262fc88dded9dff330"}
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.376860 4762 scope.go:117] "RemoveContainer" containerID="01c3bfedbbdda822752c16fbf30ea475f2a4e991d8289023001b4761f36dc674"
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.443847 4762 scope.go:117] "RemoveContainer" containerID="005c50eaea1c444d6f0b66c6862777bbe57b02af1edba0414efc1c5441023635"
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.485927 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.486004 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.486097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.486123 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.486149 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqwph\" (UniqueName: \"kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.486187 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb\") pod \"8befecb9-4510-4921-a212-e80a8b832855\" (UID: \"8befecb9-4510-4921-a212-e80a8b832855\") "
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.774902 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph" (OuterVolumeSpecName: "kube-api-access-lqwph") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "kube-api-access-lqwph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.802341 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.806149 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.806184 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqwph\" (UniqueName: \"kubernetes.io/projected/8befecb9-4510-4921-a212-e80a8b832855-kube-api-access-lqwph\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.811538 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.851462 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.852292 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.908305 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.908339 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.908349 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:13 crc kubenswrapper[4762]: I0217 14:31:13.920294 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config" (OuterVolumeSpecName: "config") pod "8befecb9-4510-4921-a212-e80a8b832855" (UID: "8befecb9-4510-4921-a212-e80a8b832855"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.010862 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8befecb9-4510-4921-a212-e80a8b832855-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.399736 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-gd7pw"
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.408482 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1e58addf-d172-4f09-b4e5-30b62cafb801","Type":"ContainerStarted","Data":"2d64409055087bc7907dd1c989384ef79513912d3787e9674033d4b810026b66"}
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.408768 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.426371 4762 generic.go:334] "Generic (PLEG): container finished" podID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerID="6d98430e1f94464289bc63fa02da9dc080caacde8e8b1a23b7ac7a5be99b5372" exitCode=0
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.426420 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerDied","Data":"6d98430e1f94464289bc63fa02da9dc080caacde8e8b1a23b7ac7a5be99b5372"}
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.437399 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.437369437 podStartE2EDuration="4.437369437s" podCreationTimestamp="2026-02-17 14:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:14.425107145 +0000 UTC m=+1555.005107807" watchObservedRunningTime="2026-02-17 14:31:14.437369437 +0000 UTC m=+1555.017370089"
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.460916 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"]
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.474787 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-gd7pw"]
Feb 17 14:31:14 crc kubenswrapper[4762]: I0217 14:31:14.951025 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f47cdcfb-z94h7"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.042863 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 14:31:15 crc kubenswrapper[4762]: E0217 14:31:15.043421 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="dnsmasq-dns"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043449 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="dnsmasq-dns"
Feb 17 14:31:15 crc kubenswrapper[4762]: E0217 14:31:15.043500 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-log"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043510 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-log"
Feb 17 14:31:15 crc kubenswrapper[4762]: E0217 14:31:15.043528 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="init"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043537 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="init"
Feb 17 14:31:15 crc kubenswrapper[4762]: E0217 14:31:15.043582 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-api"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043593 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-api"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043824 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-log"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043846 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" containerName="placement-api"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.043855 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8befecb9-4510-4921-a212-e80a8b832855" containerName="dnsmasq-dns"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.044864 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.051686 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.051992 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.053206 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r7fth"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.071470 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.097954 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098303 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098355 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098558 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2xmk\" (UniqueName: \"kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098785 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.098825 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data\") pod \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\" (UID: \"f1d38ad5-c049-4efe-b9c2-a52e54ebff80\") "
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.101289 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs" (OuterVolumeSpecName: "logs") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.105002 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.106523 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts" (OuterVolumeSpecName: "scripts") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.112834 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk" (OuterVolumeSpecName: "kube-api-access-r2xmk") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "kube-api-access-r2xmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.186412 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data" (OuterVolumeSpecName: "config-data") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.189198 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.207571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.209434 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbxw\" (UniqueName: \"kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.209739 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.210075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.210424 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2xmk\" (UniqueName: \"kubernetes.io/projected/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-kube-api-access-r2xmk\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.210542 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.210638 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.210832 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.266794 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.277597 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1d38ad5-c049-4efe-b9c2-a52e54ebff80" (UID: "f1d38ad5-c049-4efe-b9c2-a52e54ebff80"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.311916 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbxw\" (UniqueName: \"kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.311961 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.312047 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.312146 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.312203 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.312213 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1d38ad5-c049-4efe-b9c2-a52e54ebff80-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.313597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.315963 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.316839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.336163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbxw\" (UniqueName: \"kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw\") pod \"openstackclient\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.363522 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.447145 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f47cdcfb-z94h7"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.448728 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f47cdcfb-z94h7" event={"ID":"f1d38ad5-c049-4efe-b9c2-a52e54ebff80","Type":"ContainerDied","Data":"1e89929ca4a392de8b6214e0633686b4c6f8eab3965e4ef008dd4967670e1344"}
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.448841 4762 scope.go:117] "RemoveContainer" containerID="6d98430e1f94464289bc63fa02da9dc080caacde8e8b1a23b7ac7a5be99b5372"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.800877 4762 scope.go:117] "RemoveContainer" containerID="cdab68fc6343a968244b7f29f859576c366cb98df02dc7e9dfd38fb1a11553de"
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.817720 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"]
Feb 17 14:31:15 crc kubenswrapper[4762]: I0217 14:31:15.863384 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9f47cdcfb-z94h7"]
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.071826 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.096712 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8befecb9-4510-4921-a212-e80a8b832855" path="/var/lib/kubelet/pods/8befecb9-4510-4921-a212-e80a8b832855/volumes"
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.098090 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d38ad5-c049-4efe-b9c2-a52e54ebff80" path="/var/lib/kubelet/pods/f1d38ad5-c049-4efe-b9c2-a52e54ebff80/volumes"
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.099011 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.238855 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.240428 4762 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.270616 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.418144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.418214 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.418273 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.418329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxz8j\" (UniqueName: \"kubernetes.io/projected/c9dd2323-04a9-409b-b035-7d086e4eaef6-kube-api-access-rxz8j\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.507969 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.520058 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.520125 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.520189 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.520260 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxz8j\" (UniqueName: \"kubernetes.io/projected/c9dd2323-04a9-409b-b035-7d086e4eaef6-kube-api-access-rxz8j\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.522312 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.533224 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-openstack-config-secret\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.533744 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dd2323-04a9-409b-b035-7d086e4eaef6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.556702 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxz8j\" (UniqueName: \"kubernetes.io/projected/c9dd2323-04a9-409b-b035-7d086e4eaef6-kube-api-access-rxz8j\") pod \"openstackclient\" (UID: \"c9dd2323-04a9-409b-b035-7d086e4eaef6\") " pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.591048 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:16 crc kubenswrapper[4762]: I0217 14:31:16.827162 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 14:31:16 crc kubenswrapper[4762]: E0217 14:31:16.967803 4762 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 14:31:16 crc kubenswrapper[4762]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_9d5884c5-1bca-4205-a246-87e6d4351871_0(4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5" Netns:"/var/run/netns/530c0a2d-da70-4a5f-92a2-11235ac0d79f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5;K8S_POD_UID=9d5884c5-1bca-4205-a246-87e6d4351871" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/9d5884c5-1bca-4205-a246-87e6d4351871:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5 network default NAD default] [openstack/openstackclient 4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:d4 [10.217.0.212/23] Feb 17 14:31:16 crc kubenswrapper[4762]: ' Feb 17 14:31:16 crc kubenswrapper[4762]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 14:31:16 crc kubenswrapper[4762]: > Feb 17 14:31:16 crc kubenswrapper[4762]: E0217 14:31:16.967870 4762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 14:31:16 crc kubenswrapper[4762]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_9d5884c5-1bca-4205-a246-87e6d4351871_0(4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5" Netns:"/var/run/netns/530c0a2d-da70-4a5f-92a2-11235ac0d79f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5;K8S_POD_UID=9d5884c5-1bca-4205-a246-87e6d4351871" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/9d5884c5-1bca-4205-a246-87e6d4351871:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5 network default NAD default] [openstack/openstackclient 4a9ce367d157685c6218c6cfaa07104334d8f991c55b7c61bd21e4f3a9f0e8c5 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:d4 [10.217.0.212/23] Feb 17 14:31:16 crc kubenswrapper[4762]: ' Feb 17 14:31:16 crc kubenswrapper[4762]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 14:31:16 crc kubenswrapper[4762]: > pod="openstack/openstackclient" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.541131 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.542440 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="cinder-scheduler" containerID="cri-o://766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc" gracePeriod=30 Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.542923 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="probe" containerID="cri-o://0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7" gracePeriod=30 Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.547100 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9d5884c5-1bca-4205-a246-87e6d4351871" podUID="c9dd2323-04a9-409b-b035-7d086e4eaef6" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.576790 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.658018 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsbxw\" (UniqueName: \"kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw\") pod \"9d5884c5-1bca-4205-a246-87e6d4351871\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.658105 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle\") pod \"9d5884c5-1bca-4205-a246-87e6d4351871\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.658233 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config\") pod \"9d5884c5-1bca-4205-a246-87e6d4351871\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.658393 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret\") pod \"9d5884c5-1bca-4205-a246-87e6d4351871\" (UID: \"9d5884c5-1bca-4205-a246-87e6d4351871\") " Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.666281 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9d5884c5-1bca-4205-a246-87e6d4351871" (UID: "9d5884c5-1bca-4205-a246-87e6d4351871"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.668666 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d5884c5-1bca-4205-a246-87e6d4351871" (UID: "9d5884c5-1bca-4205-a246-87e6d4351871"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.670807 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw" (OuterVolumeSpecName: "kube-api-access-qsbxw") pod "9d5884c5-1bca-4205-a246-87e6d4351871" (UID: "9d5884c5-1bca-4205-a246-87e6d4351871"). InnerVolumeSpecName "kube-api-access-qsbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.682620 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9d5884c5-1bca-4205-a246-87e6d4351871" (UID: "9d5884c5-1bca-4205-a246-87e6d4351871"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.763054 4762 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.763089 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsbxw\" (UniqueName: \"kubernetes.io/projected/9d5884c5-1bca-4205-a246-87e6d4351871-kube-api-access-qsbxw\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.763100 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5884c5-1bca-4205-a246-87e6d4351871-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.763111 4762 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d5884c5-1bca-4205-a246-87e6d4351871-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:17 crc kubenswrapper[4762]: I0217 14:31:17.981934 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 14:31:17 crc kubenswrapper[4762]: W0217 14:31:17.997832 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9dd2323_04a9_409b_b035_7d086e4eaef6.slice/crio-fe895e17b678fb0b1779cca4f475ec161242313155281822826ae8036876fb49 WatchSource:0}: Error finding container fe895e17b678fb0b1779cca4f475ec161242313155281822826ae8036876fb49: Status 404 returned error can't find the container with id fe895e17b678fb0b1779cca4f475ec161242313155281822826ae8036876fb49 Feb 17 14:31:18 crc kubenswrapper[4762]: I0217 14:31:18.083764 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5884c5-1bca-4205-a246-87e6d4351871" path="/var/lib/kubelet/pods/9d5884c5-1bca-4205-a246-87e6d4351871/volumes" Feb 17 14:31:18 crc kubenswrapper[4762]: I0217 14:31:18.698307 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:31:18 crc kubenswrapper[4762]: I0217 14:31:18.699244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9dd2323-04a9-409b-b035-7d086e4eaef6","Type":"ContainerStarted","Data":"fe895e17b678fb0b1779cca4f475ec161242313155281822826ae8036876fb49"} Feb 17 14:31:18 crc kubenswrapper[4762]: I0217 14:31:18.707570 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9d5884c5-1bca-4205-a246-87e6d4351871" podUID="c9dd2323-04a9-409b-b035-7d086e4eaef6" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.069883 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5bfd9c8d59-mxmfg"] Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.073853 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.076917 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.077104 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.077274 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.108287 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bfd9c8d59-mxmfg"] Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.186094 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-combined-ca-bundle\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.186179 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-run-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188139 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-public-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188213 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-log-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-etc-swift\") pod 
\"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188516 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-config-data\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188566 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-internal-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.188752 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-828vk\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-kube-api-access-828vk\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.291681 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-config-data\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.291975 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-internal-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292075 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-828vk\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-kube-api-access-828vk\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292167 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-combined-ca-bundle\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292219 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-run-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292276 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-public-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292307 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-log-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.292335 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-etc-swift\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.293122 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-run-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.293415 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/849ff889-c3dd-4ae3-b103-b49b6ad2535d-log-httpd\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.301736 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-public-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.302119 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-internal-tls-certs\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.302738 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-combined-ca-bundle\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.302792 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849ff889-c3dd-4ae3-b103-b49b6ad2535d-config-data\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.312965 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-828vk\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-kube-api-access-828vk\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: 
\"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.314468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/849ff889-c3dd-4ae3-b103-b49b6ad2535d-etc-swift\") pod \"swift-proxy-5bfd9c8d59-mxmfg\" (UID: \"849ff889-c3dd-4ae3-b103-b49b6ad2535d\") " pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.401321 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.718962 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.772829 4762 generic.go:334] "Generic (PLEG): container finished" podID="649724f9-1014-4a15-a289-f82f67e420dd" containerID="0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7" exitCode=0 Feb 17 14:31:19 crc kubenswrapper[4762]: I0217 14:31:19.772884 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerDied","Data":"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7"} Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.355588 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f7475d794-g4jpc" Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.461500 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.461818 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-756fc9c9d4-786zt" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api-log" containerID="cri-o://df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d" gracePeriod=30 Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.462305 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-756fc9c9d4-786zt" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api" containerID="cri-o://5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43" gracePeriod=30 Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.770400 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bfd9c8d59-mxmfg"] Feb 17 14:31:20 crc kubenswrapper[4762]: I0217 14:31:20.979217 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" event={"ID":"849ff889-c3dd-4ae3-b103-b49b6ad2535d","Type":"ContainerStarted","Data":"ed9bfa0dd72cfaadf9a2831cf0d5fe2f03dfcaa2cfc86f0fdeb12be79d905b6d"} Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.001911 4762 generic.go:334] "Generic (PLEG): container finished" podID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerID="df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d" exitCode=143 Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.002761 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerDied","Data":"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d"} Feb 17 14:31:21 crc kubenswrapper[4762]: 
I0217 14:31:21.407556 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-558c556c77-d2tbn" Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.497373 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.497694 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f47bdcf85-g4f9w" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-api" containerID="cri-o://40bfadd0be5a49cf632f62cc2d679da6a27b3b7606bb06e8c319ffb998c7a00a" gracePeriod=30 Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.498299 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f47bdcf85-g4f9w" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-httpd" containerID="cri-o://32a94d62c2e7d2a6766a7870466783bc42e46fbe12f626f85b1a7961462224e0" gracePeriod=30 Feb 17 14:31:21 crc kubenswrapper[4762]: I0217 14:31:21.855326 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.281278 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.281402 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.281504 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.281626 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.282414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.282581 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data\") pod \"649724f9-1014-4a15-a289-f82f67e420dd\" (UID: \"649724f9-1014-4a15-a289-f82f67e420dd\") " Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.283863 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id" (OuterVolumeSpecName: 
"etc-machine-id") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.293960 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk" (OuterVolumeSpecName: "kube-api-access-gqbgk") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). InnerVolumeSpecName "kube-api-access-gqbgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.338539 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.356016 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts" (OuterVolumeSpecName: "scripts") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.393916 4762 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/649724f9-1014-4a15-a289-f82f67e420dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.393948 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.393961 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/649724f9-1014-4a15-a289-f82f67e420dd-kube-api-access-gqbgk\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.393975 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.414027 4762 generic.go:334] "Generic (PLEG): container finished" podID="649724f9-1014-4a15-a289-f82f67e420dd" containerID="766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc" exitCode=0 Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.414155 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.414314 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerDied","Data":"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc"} Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.414348 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"649724f9-1014-4a15-a289-f82f67e420dd","Type":"ContainerDied","Data":"1e8f7576bdb5614a2334ed2eebedc86a7b4e37e374216554c3dd86a1e47a07aa"} Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.414369 4762 scope.go:117] "RemoveContainer" containerID="0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.477972 4762 generic.go:334] "Generic (PLEG): container finished" podID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerID="32a94d62c2e7d2a6766a7870466783bc42e46fbe12f626f85b1a7961462224e0" exitCode=0 Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.478103 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerDied","Data":"32a94d62c2e7d2a6766a7870466783bc42e46fbe12f626f85b1a7961462224e0"} Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.493892 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" event={"ID":"849ff889-c3dd-4ae3-b103-b49b6ad2535d","Type":"ContainerStarted","Data":"4881bd3b7a52794d0941e5076768deacc16fb8343a453774d3700076183e88c6"} Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.493956 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" event={"ID":"849ff889-c3dd-4ae3-b103-b49b6ad2535d","Type":"ContainerStarted","Data":"70e5e37c593c5d77cfd243f95767beba200db1ba3b8a4313bcb786ebca189cd9"} Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.494389 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.494739 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.499301 4762 scope.go:117] "RemoveContainer" containerID="766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.565868 4762 scope.go:117] "RemoveContainer" containerID="0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.572483 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: E0217 14:31:22.591624 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7\": container with ID starting with 0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7 not found: ID does not exist" containerID="0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.591741 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7"} err="failed to get container status \"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7\": rpc error: code = NotFound desc = could not find container \"0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7\": container with ID starting with 0ff31f9f360ba6517f72ed2971c77421498fc9cc61e609f869c187d9db8437f7 not found: ID does not exist" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.591779 4762 scope.go:117] "RemoveContainer" containerID="766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.592826 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" podStartSLOduration=3.592813879 podStartE2EDuration="3.592813879s" podCreationTimestamp="2026-02-17 14:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:22.531871129 +0000 UTC m=+1563.111871801" watchObservedRunningTime="2026-02-17 14:31:22.592813879 +0000 UTC m=+1563.172814531" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.616659 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: E0217 14:31:22.626285 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc\": container with ID starting with 766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc not found: ID does not exist" containerID="766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.626353 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc"} err="failed to get container status \"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc\": rpc error: code = NotFound desc = could not find container \"766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc\": container with ID starting with 766282353a0099bacec5323df8cff3521a2e582289af0215dd311eca5f191dcc not found: ID does not exist" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.685075 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data" (OuterVolumeSpecName: "config-data") pod "649724f9-1014-4a15-a289-f82f67e420dd" (UID: "649724f9-1014-4a15-a289-f82f67e420dd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.719788 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649724f9-1014-4a15-a289-f82f67e420dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.816797 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.838048 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.889140 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:22 crc kubenswrapper[4762]: E0217 14:31:22.891007 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="cinder-scheduler" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.891143 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="cinder-scheduler" Feb 17 14:31:22 crc kubenswrapper[4762]: E0217 14:31:22.891240 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="probe" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.891377 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="probe" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.891949 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="cinder-scheduler" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.892055 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="649724f9-1014-4a15-a289-f82f67e420dd" containerName="probe" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.893752 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:31:22 crc kubenswrapper[4762]: I0217 14:31:22.898702 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.269425 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.365911 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/048d8d34-8b8e-4267-9747-2db21026d3a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.366043 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fc2\" (UniqueName: \"kubernetes.io/projected/048d8d34-8b8e-4267-9747-2db21026d3a8-kube-api-access-27fc2\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.366154 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.366200 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.366401 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.367167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469154 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/048d8d34-8b8e-4267-9747-2db21026d3a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469600 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-27fc2\" (UniqueName: \"kubernetes.io/projected/048d8d34-8b8e-4267-9747-2db21026d3a8-kube-api-access-27fc2\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469667 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.469800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.470381 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/048d8d34-8b8e-4267-9747-2db21026d3a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.477935 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.478214 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.479127 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.488371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/048d8d34-8b8e-4267-9747-2db21026d3a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.489203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fc2\" (UniqueName: \"kubernetes.io/projected/048d8d34-8b8e-4267-9747-2db21026d3a8-kube-api-access-27fc2\") pod \"cinder-scheduler-0\" (UID: \"048d8d34-8b8e-4267-9747-2db21026d3a8\") " pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 
crc kubenswrapper[4762]: I0217 14:31:23.612632 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.738835 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5f7475d794-g4jpc" podUID="dafb15f9-f633-4acc-a69f-6199b20ae0e7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.210:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.860630 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-756fc9c9d4-786zt" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:45844->10.217.0.206:9311: read: connection reset by peer" Feb 17 14:31:23 crc kubenswrapper[4762]: I0217 14:31:23.860712 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-756fc9c9d4-786zt" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:45848->10.217.0.206:9311: read: connection reset by peer" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.097094 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649724f9-1014-4a15-a289-f82f67e420dd" path="/var/lib/kubelet/pods/649724f9-1014-4a15-a289-f82f67e420dd/volumes" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.233708 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.297175 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.330685 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.365722 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.478971 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.540327 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data\") pod \"81febbb2-748e-4ca9-a7aa-279aed792ffa\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.540446 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle\") pod \"81febbb2-748e-4ca9-a7aa-279aed792ffa\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.540507 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfjxt\" (UniqueName: \"kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt\") pod \"81febbb2-748e-4ca9-a7aa-279aed792ffa\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.540595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom\") pod \"81febbb2-748e-4ca9-a7aa-279aed792ffa\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.540739 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs\") pod \"81febbb2-748e-4ca9-a7aa-279aed792ffa\" (UID: \"81febbb2-748e-4ca9-a7aa-279aed792ffa\") " Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.541298 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk76k\" (UniqueName: \"kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.541347 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.541384 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.545354 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs" (OuterVolumeSpecName: "logs") pod "81febbb2-748e-4ca9-a7aa-279aed792ffa" (UID: "81febbb2-748e-4ca9-a7aa-279aed792ffa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.556905 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt" (OuterVolumeSpecName: "kube-api-access-dfjxt") pod "81febbb2-748e-4ca9-a7aa-279aed792ffa" (UID: "81febbb2-748e-4ca9-a7aa-279aed792ffa"). InnerVolumeSpecName "kube-api-access-dfjxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.589219 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81febbb2-748e-4ca9-a7aa-279aed792ffa" (UID: "81febbb2-748e-4ca9-a7aa-279aed792ffa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.594721 4762 generic.go:334] "Generic (PLEG): container finished" podID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerID="5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43" exitCode=0 Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.595047 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerDied","Data":"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43"} Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.595166 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-756fc9c9d4-786zt" event={"ID":"81febbb2-748e-4ca9-a7aa-279aed792ffa","Type":"ContainerDied","Data":"0b5e643c3d05469b963433da6f2279c22b43d1c00a9880905791b06503aa0011"} Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.595245 4762 scope.go:117] "RemoveContainer" containerID="5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.595452 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-756fc9c9d4-786zt" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.608682 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="1e58addf-d172-4f09-b4e5-30b62cafb801" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.211:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.616498 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"048d8d34-8b8e-4267-9747-2db21026d3a8","Type":"ContainerStarted","Data":"5b2bf66898209dad1a58fea30363112d9cf1ee2e48cc5c11e3833f20844a0862"} Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.626618 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.626703 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.626792 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.628100 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.628193 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" gracePeriod=600 Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654550 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk76k\" (UniqueName: \"kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654616 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654682 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654845 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfjxt\" (UniqueName: \"kubernetes.io/projected/81febbb2-748e-4ca9-a7aa-279aed792ffa-kube-api-access-dfjxt\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654860 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.654872 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81febbb2-748e-4ca9-a7aa-279aed792ffa-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.655281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.656672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.686023 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data" (OuterVolumeSpecName: "config-data") pod "81febbb2-748e-4ca9-a7aa-279aed792ffa" (UID: "81febbb2-748e-4ca9-a7aa-279aed792ffa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.696480 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk76k\" (UniqueName: \"kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k\") pod \"redhat-marketplace-5x5bg\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.715838 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81febbb2-748e-4ca9-a7aa-279aed792ffa" (UID: "81febbb2-748e-4ca9-a7aa-279aed792ffa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.727382 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.759064 4762 scope.go:117] "RemoveContainer" containerID="df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.761685 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:24 crc kubenswrapper[4762]: I0217 14:31:24.761845 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81febbb2-748e-4ca9-a7aa-279aed792ffa-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:24 crc kubenswrapper[4762]: E0217 14:31:24.859264 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.081952 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.098274 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-756fc9c9d4-786zt"] Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.173484 4762 scope.go:117] "RemoveContainer" containerID="5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43" Feb 17 14:31:25 crc kubenswrapper[4762]: E0217 14:31:25.174690 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43\": container with ID starting with 5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43 not found: ID does not exist" containerID="5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.174749 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43"} err="failed to get container status \"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43\": rpc error: code = NotFound desc = could not find container \"5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43\": container with ID starting with 5367fb593d444fbfe9e2141c0d8534169cf1ce7d6d6b376b4c9bf8390c936d43 not found: ID does not exist" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.174780 4762 scope.go:117] "RemoveContainer" containerID="df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d" Feb 17 14:31:25 crc kubenswrapper[4762]: E0217 14:31:25.175310 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d\": container with ID starting with df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d not found: ID does not exist" containerID="df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 
14:31:25.175434 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d"} err="failed to get container status \"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d\": rpc error: code = NotFound desc = could not find container \"df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d\": container with ID starting with df8152cb3b981252aed1799dd091bf850ed7aa1610534790193442526dfbcf0d not found: ID does not exist" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.524389 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.584809 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1e58addf-d172-4f09-b4e5-30b62cafb801" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.211:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.607169 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.626908 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627079 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627164 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627230 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627573 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.627943 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd\") pod \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\" (UID: \"a4d225d9-98bc-48c2-94a2-0c74c3f11d89\") " Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.631137 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.635033 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.647854 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.647889 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.657254 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts" (OuterVolumeSpecName: "scripts") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.698897 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j" (OuterVolumeSpecName: "kube-api-access-s6z6j") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "kube-api-access-s6z6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.708007 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" exitCode=0 Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.708105 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46"} Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.708168 4762 scope.go:117] "RemoveContainer" containerID="1f57f792acac65c40f56a21d9846b71db555cf9b18e70e6ffc6202b1c323fd44" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.711009 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:31:25 crc kubenswrapper[4762]: E0217 14:31:25.711821 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.716290 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerStarted","Data":"fb666b90112391b53b4eac87a2636d25dbb4ec3b615ea1a973331fc2b6dc2d49"} Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.755915 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-kube-api-access-s6z6j\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.756262 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.766237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"048d8d34-8b8e-4267-9747-2db21026d3a8","Type":"ContainerStarted","Data":"62888a05ae69b5d62b92a0a6553dbaf76963a1a7bd26e7c495deccd89f28d09b"} Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.811219 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerID="fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b" exitCode=137 Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.811273 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerDied","Data":"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b"} Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.811301 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d225d9-98bc-48c2-94a2-0c74c3f11d89","Type":"ContainerDied","Data":"d2dce3d6df3d3d924acc24709f937ab62f744b764e99c4ad4f86c384d3d0b733"} Feb 17 14:31:25 crc kubenswrapper[4762]: 
I0217 14:31:25.811384 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.858776 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.862828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data" (OuterVolumeSpecName: "config-data") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.886948 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d225d9-98bc-48c2-94a2-0c74c3f11d89" (UID: "a4d225d9-98bc-48c2-94a2-0c74c3f11d89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.961364 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.961409 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.961424 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d225d9-98bc-48c2-94a2-0c74c3f11d89-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:25 crc kubenswrapper[4762]: I0217 14:31:25.994952 4762 scope.go:117] "RemoveContainer" containerID="fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.051963 4762 scope.go:117] "RemoveContainer" containerID="e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.100627 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" path="/var/lib/kubelet/pods/81febbb2-748e-4ca9-a7aa-279aed792ffa/volumes" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.139874 4762 scope.go:117] "RemoveContainer" containerID="0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.229076 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.257832 4762 scope.go:117] "RemoveContainer" containerID="fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.262736 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b\": container with ID starting with fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b not found: ID does not exist" containerID="fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.262774 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b"} err="failed to get container status \"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b\": rpc error: code = NotFound desc = could not find container \"fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b\": container with ID starting with fa53ff2814fc426993e8d6b7bea585fa0ca0d494379926022c6868f125014b2b not found: ID does not exist" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.262798 4762 scope.go:117] "RemoveContainer" containerID="e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.265055 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.270908 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3\": container with ID starting with e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3 not found: ID does not exist" containerID="e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.270942 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3"} err="failed to get container status \"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3\": rpc error: code = NotFound desc = could not find container \"e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3\": container with ID starting with e175432f2bcb680da18dc6b1db1f29fbefee40d93535d694c55f92beccf1a7a3 not found: ID does not exist" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.270963 4762 scope.go:117] "RemoveContainer" containerID="0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.284052 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430\": container with ID starting with 0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430 not found: ID does not exist" containerID="0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.284102 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430"} err="failed to get container status \"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430\": rpc error: code = NotFound desc = could not find container \"0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430\": container with ID starting with 0e42226bf4411bdffd791d1f73ea0af6ac6f0054cd21e3a196ecf2ac6356c430 not found: ID does not exist" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 
14:31:26.286510 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.287108 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="ceilometer-notification-agent" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287133 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="ceilometer-notification-agent" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.287170 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="sg-core" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287181 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="sg-core" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.287204 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="proxy-httpd" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287212 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="proxy-httpd" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.287260 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api-log" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287270 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api-log" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.287288 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287297 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287574 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287612 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="sg-core" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287800 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="proxy-httpd" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287827 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" containerName="ceilometer-notification-agent" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.287838 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="81febbb2-748e-4ca9-a7aa-279aed792ffa" containerName="barbican-api-log" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.290933 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.297099 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.297296 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.330047 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.384786 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.384860 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.384932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.384992 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpqn\" (UniqueName: \"kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.385153 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.385208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.385558 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488054 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpqn\" (UniqueName: \"kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: 
I0217 14:31:26.488164 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488186 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488274 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488403 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488437 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488467 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488708 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.488805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.493311 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.494751 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.498285 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " 
pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.498508 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.500519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: E0217 14:31:26.503720 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-drpqn scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="47460499-0eb9-4fcb-bd2b-8e7084f6f26c" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.521522 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpqn\" (UniqueName: \"kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn\") pod \"ceilometer-0\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.842229 4762 generic.go:334] "Generic (PLEG): container finished" podID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerID="d5637ab010ca30227d0f7953c7c27e73d747e7dceb945206c765e4da83221f3c" exitCode=0 Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.842379 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerDied","Data":"d5637ab010ca30227d0f7953c7c27e73d747e7dceb945206c765e4da83221f3c"} Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.854246 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"048d8d34-8b8e-4267-9747-2db21026d3a8","Type":"ContainerStarted","Data":"3a13966f1628333c681ed865663c47a77c9866660dcbab2cb8c040e6a9d1f5e8"} Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.863284 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.888286 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:26 crc kubenswrapper[4762]: I0217 14:31:26.903232 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.903183793 podStartE2EDuration="4.903183793s" podCreationTimestamp="2026-02-17 14:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:26.898849176 +0000 UTC m=+1567.478849848" watchObservedRunningTime="2026-02-17 14:31:26.903183793 +0000 UTC m=+1567.483184445" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000032 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000157 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000277 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000305 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000421 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000564 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drpqn\" (UniqueName: \"kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.000637 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle\") pod \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\" (UID: \"47460499-0eb9-4fcb-bd2b-8e7084f6f26c\") " Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.002527 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.002778 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.008197 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data" (OuterVolumeSpecName: "config-data") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.026013 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.030812 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.035789 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts" (OuterVolumeSpecName: "scripts") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.056251 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn" (OuterVolumeSpecName: "kube-api-access-drpqn") pod "47460499-0eb9-4fcb-bd2b-8e7084f6f26c" (UID: "47460499-0eb9-4fcb-bd2b-8e7084f6f26c"). InnerVolumeSpecName "kube-api-access-drpqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110247 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drpqn\" (UniqueName: \"kubernetes.io/projected/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-kube-api-access-drpqn\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110278 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110287 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110295 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110306 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110314 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.110322 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47460499-0eb9-4fcb-bd2b-8e7084f6f26c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.491523 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.493358 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.500727 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.500969 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mhg26" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.501079 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.530059 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.618755 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.621162 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.663992 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56hq\" (UniqueName: \"kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.664563 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.664976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.665403 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.694717 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.773832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.773950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774074 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56hq\" (UniqueName: \"kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774093 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774214 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6dj\" (UniqueName: \"kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774323 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774383 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.774568 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.796566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.803428 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.807679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.827305 4762 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.828871 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.834311 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.841031 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56hq\" (UniqueName: \"kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq\") pod \"heat-engine-f8f7cc6b-9bscz\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.841876 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881400 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881542 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6dj\" (UniqueName: \"kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881593 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881622 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881699 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.881725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.882602 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config\") pod 
\"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.883665 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"] Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.885803 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.891631 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.892215 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.892584 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.922872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.923180 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.974916 4762 generic.go:334] "Generic (PLEG): container finished" podID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerID="40bfadd0be5a49cf632f62cc2d679da6a27b3b7606bb06e8c319ffb998c7a00a" exitCode=0 Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.978522 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerDied","Data":"40bfadd0be5a49cf632f62cc2d679da6a27b3b7606bb06e8c319ffb998c7a00a"} Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.978591 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47bdcf85-g4f9w" event={"ID":"922b4fd8-4192-45a2-9fad-c6e49f93e9eb","Type":"ContainerDied","Data":"68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a"} Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.978619 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a4c0f317049f6e5a3b6e386a3b51373cb86361a48c1cf8b73104ded7c8361a" Feb 17 14:31:27 crc kubenswrapper[4762]: I0217 14:31:27.978779 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.001555 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.009167 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.009512 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvb85\" (UniqueName: \"kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.009698 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.005401 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"] Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.009896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.010037 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lhx\" (UniqueName: \"kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.010224 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.010447 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.047846 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6dj\" (UniqueName: \"kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj\") pod \"dnsmasq-dns-688b9f5b49-wntzm\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") " pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118284 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvb85\" (UniqueName: \"kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118355 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lhx\" (UniqueName: \"kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118497 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118586 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.118787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.125568 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.126073 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.126905 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.127163 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.134853 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.135582 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.135729 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.145490 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvb85\" (UniqueName: \"kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85\") pod \"heat-cfnapi-6fcd77bc97-54sbg\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.146202 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lhx\" (UniqueName: \"kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx\") pod \"heat-api-59c546d4cd-5fhzh\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.261837 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.315180 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d225d9-98bc-48c2-94a2-0c74c3f11d89" path="/var/lib/kubelet/pods/a4d225d9-98bc-48c2-94a2-0c74c3f11d89/volumes" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.404109 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.404182 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.420867 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.451750 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.451882 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.452068 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.452154 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.452321 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lhjh\" (UniqueName: \"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.452389 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.452440 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs\") pod \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\" (UID: \"922b4fd8-4192-45a2-9fad-c6e49f93e9eb\") " Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.462928 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh" (OuterVolumeSpecName: "kube-api-access-2lhjh") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "kube-api-access-2lhjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.500203 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.556887 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lhjh\" (UniqueName: \"kubernetes.io/projected/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-kube-api-access-2lhjh\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.557225 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.613021 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.649819 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.649863 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.659462 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.659500 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.684330 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config" (OuterVolumeSpecName: "config") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.761400 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.767318 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.834458 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "922b4fd8-4192-45a2-9fad-c6e49f93e9eb" (UID: "922b4fd8-4192-45a2-9fad-c6e49f93e9eb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.836288 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.865507 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4762]: I0217 14:31:28.865544 4762 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/922b4fd8-4192-45a2-9fad-c6e49f93e9eb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:29 crc kubenswrapper[4762]: W0217 14:31:29.001655 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37fd57d6_2520_488b_9ce4_c316d6d62bc5.slice/crio-cce4265dee757d4d3c19fd2007ddbb035894233315f7cfd4bc4fd2ea8cafa854 WatchSource:0}: Error finding container cce4265dee757d4d3c19fd2007ddbb035894233315f7cfd4bc4fd2ea8cafa854: Status 404 returned error can't find the container with id cce4265dee757d4d3c19fd2007ddbb035894233315f7cfd4bc4fd2ea8cafa854 Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.025091 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f47bdcf85-g4f9w" Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.025420 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerStarted","Data":"33b44dc7093f08ac9b8db042dc7d3a5ae8459428ed86fa37213473b5159d80d0"} Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.331976 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.357711 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f47bdcf85-g4f9w"] Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.394957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"] Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.427052 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.427904 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bfd9c8d59-mxmfg" Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.491751 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"] Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.503197 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 14:31:29 crc kubenswrapper[4762]: I0217 14:31:29.620571 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"] Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.057297 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c546d4cd-5fhzh" event={"ID":"7f10d6b8-9fc3-478a-aee3-accc92b73dfa","Type":"ContainerStarted","Data":"a40c558d66dd8410d087050ee1bf53b604317a4487addbb3ce31b3f5f73239b3"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.063400 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" event={"ID":"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4","Type":"ContainerStarted","Data":"0fd3e5a55ce7feca0c415028a13cd5a3950d06d4749372fd20b45af8328994a6"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.090285 4762 generic.go:334] "Generic (PLEG): container finished" podID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerID="e4953faad0e578de9b5623a5cfa350b5b1615f2951a2f3335e22b610c29c27a2" exitCode=0 Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.162435 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" path="/var/lib/kubelet/pods/922b4fd8-4192-45a2-9fad-c6e49f93e9eb/volumes" Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.172746 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.172809 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" event={"ID":"7f033533-f8f8-4196-9fdd-31a14b0f019d","Type":"ContainerDied","Data":"e4953faad0e578de9b5623a5cfa350b5b1615f2951a2f3335e22b610c29c27a2"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.172832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" 
event={"ID":"7f033533-f8f8-4196-9fdd-31a14b0f019d","Type":"ContainerStarted","Data":"da8f2182c8d9b8762d3460dfcded9af6ff36eb8838370579dd722e5bcb95a16d"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.172843 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f8f7cc6b-9bscz" event={"ID":"37fd57d6-2520-488b-9ce4-c316d6d62bc5","Type":"ContainerStarted","Data":"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.172854 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f8f7cc6b-9bscz" event={"ID":"37fd57d6-2520-488b-9ce4-c316d6d62bc5","Type":"ContainerStarted","Data":"cce4265dee757d4d3c19fd2007ddbb035894233315f7cfd4bc4fd2ea8cafa854"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.190254 4762 generic.go:334] "Generic (PLEG): container finished" podID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerID="33b44dc7093f08ac9b8db042dc7d3a5ae8459428ed86fa37213473b5159d80d0" exitCode=0 Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.191911 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerDied","Data":"33b44dc7093f08ac9b8db042dc7d3a5ae8459428ed86fa37213473b5159d80d0"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.191947 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerStarted","Data":"2c899ca16dbffc9ffd16c176d1a5962956dfca67f29dc0f5ed988a1d66008235"} Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.384197 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5x5bg" podStartSLOduration=3.642148521 podStartE2EDuration="6.38416927s" podCreationTimestamp="2026-02-17 14:31:24 +0000 UTC" firstStartedPulling="2026-02-17 14:31:26.849993162 +0000 UTC m=+1567.429993814" lastFinishedPulling="2026-02-17 14:31:29.592013911 +0000 UTC m=+1570.172014563" observedRunningTime="2026-02-17 14:31:30.320379272 +0000 UTC m=+1570.900379924" watchObservedRunningTime="2026-02-17 14:31:30.38416927 +0000 UTC m=+1570.964169922" Feb 17 14:31:30 crc kubenswrapper[4762]: I0217 14:31:30.423474 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-f8f7cc6b-9bscz" podStartSLOduration=3.423451514 podStartE2EDuration="3.423451514s" podCreationTimestamp="2026-02-17 14:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:30.364167468 +0000 UTC m=+1570.944168130" watchObservedRunningTime="2026-02-17 14:31:30.423451514 +0000 UTC m=+1571.003452166" Feb 17 14:31:31 crc kubenswrapper[4762]: I0217 14:31:31.214726 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" event={"ID":"7f033533-f8f8-4196-9fdd-31a14b0f019d","Type":"ContainerStarted","Data":"4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977"} Feb 17 14:31:31 crc kubenswrapper[4762]: I0217 14:31:31.216266 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:31 crc kubenswrapper[4762]: I0217 14:31:31.251858 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" 
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.021614 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.257005 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" event={"ID":"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4","Type":"ContainerStarted","Data":"6e559f62380a05ade8cc510cf20d2dca772deb9fd9d11188930dfb8296d82cce"}
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.257121 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.260089 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c546d4cd-5fhzh" event={"ID":"7f10d6b8-9fc3-478a-aee3-accc92b73dfa","Type":"ContainerStarted","Data":"3edd36da835045c104685eeac4fa3aec31a9b1c68918bf13613ede68ee59feab"}
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.260335 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-59c546d4cd-5fhzh"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.285838 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" podStartSLOduration=3.798605147 podStartE2EDuration="7.285811782s" podCreationTimestamp="2026-02-17 14:31:27 +0000 UTC" firstStartedPulling="2026-02-17 14:31:29.574755093 +0000 UTC m=+1570.154755745" lastFinishedPulling="2026-02-17 14:31:33.061961728 +0000 UTC m=+1573.641962380" observedRunningTime="2026-02-17 14:31:34.270529468 +0000 UTC m=+1574.850530140" watchObservedRunningTime="2026-02-17 14:31:34.285811782 +0000 UTC m=+1574.865812424"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.299581 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-59c546d4cd-5fhzh" podStartSLOduration=3.915834242 podStartE2EDuration="7.299561824s" podCreationTimestamp="2026-02-17 14:31:27 +0000 UTC" firstStartedPulling="2026-02-17 14:31:29.677572179 +0000 UTC m=+1570.257572831" lastFinishedPulling="2026-02-17 14:31:33.061299761 +0000 UTC m=+1573.641300413" observedRunningTime="2026-02-17 14:31:34.295305909 +0000 UTC m=+1574.875306561" watchObservedRunningTime="2026-02-17 14:31:34.299561824 +0000 UTC m=+1574.879562476"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.728875 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5x5bg"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.729228 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5x5bg"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.923759 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-68c7cc4b78-lr6mt"]
Feb 17 14:31:34 crc kubenswrapper[4762]: E0217 14:31:34.924372 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-api"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.924398 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-api"
Feb 17 14:31:34 crc kubenswrapper[4762]: E0217 14:31:34.924417 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-httpd"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.924428 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-httpd"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.924747 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-api"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.924801 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="922b4fd8-4192-45a2-9fad-c6e49f93e9eb" containerName="neutron-httpd"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.929925 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.956422 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68c7cc4b78-lr6mt"]
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.991790 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"]
Feb 17 14:31:34 crc kubenswrapper[4762]: I0217 14:31:34.995136 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.005208 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6545f49b85-762lt"]
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.007204 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.033713 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"]
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.052631 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6545f49b85-762lt"]
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115143 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data-custom\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115234 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lftlj\" (UniqueName: \"kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115280 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-combined-ca-bundle\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115316 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115371 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115396 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpmzc\" (UniqueName: \"kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115510 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk78d\" (UniqueName: \"kubernetes.io/projected/d19729e1-9b79-4762-821b-10ccba91c176-kube-api-access-kk78d\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115554 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115588 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115614 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115670 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.115723 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217678 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217720 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpmzc\" (UniqueName: \"kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217824 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk78d\" (UniqueName: \"kubernetes.io/projected/d19729e1-9b79-4762-821b-10ccba91c176-kube-api-access-kk78d\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217911 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217932 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.217993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data-custom\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.218037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lftlj\" (UniqueName: \"kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.218065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-combined-ca-bundle\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.218102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.228931 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.230381 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.231234 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.231575 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.232400 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.236555 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-combined-ca-bundle\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.249839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.250462 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19729e1-9b79-4762-821b-10ccba91c176-config-data-custom\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.250672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpmzc\" (UniqueName: \"kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.251334 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk78d\" (UniqueName: \"kubernetes.io/projected/d19729e1-9b79-4762-821b-10ccba91c176-kube-api-access-kk78d\") pod \"heat-engine-68c7cc4b78-lr6mt\" (UID: \"d19729e1-9b79-4762-821b-10ccba91c176\") " pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.252078 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data\") pod \"heat-cfnapi-68d86764f7-2hn2f\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.254274 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lftlj\" (UniqueName: \"kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj\") pod \"heat-api-6545f49b85-762lt\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.255008 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68c7cc4b78-lr6mt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.335452 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68d86764f7-2hn2f"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.352976 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6545f49b85-762lt"
Feb 17 14:31:35 crc kubenswrapper[4762]: I0217 14:31:35.788764 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5x5bg" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:31:35 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:31:35 crc kubenswrapper[4762]: >
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.034376 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"]
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.034951 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-59c546d4cd-5fhzh" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerName="heat-api" containerID="cri-o://3edd36da835045c104685eeac4fa3aec31a9b1c68918bf13613ede68ee59feab" gracePeriod=60
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.058081 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"]
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.058323 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerName="heat-cfnapi" containerID="cri-o://6e559f62380a05ade8cc510cf20d2dca772deb9fd9d11188930dfb8296d82cce" gracePeriod=60
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.107108 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6885f6c5bd-nskzc"]
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.109100 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6885f6c5bd-nskzc"
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.122535 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc"
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.125677 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc"
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.137969 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-579766b5b-pgs2q"]
Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.152239 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-579766b5b-pgs2q"
Need to start a new one" pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.165364 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.165598 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.195527 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6885f6c5bd-nskzc"] Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275317 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-public-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275363 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data-custom\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275458 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849rj\" (UniqueName: \"kubernetes.io/projected/d0e19e34-aa03-40bc-8f4b-3604a80d6683-kube-api-access-849rj\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-combined-ca-bundle\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-internal-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-combined-ca-bundle\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275635 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-internal-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275703 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-public-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275718 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data-custom\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.275800 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g96k\" (UniqueName: \"kubernetes.io/projected/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-kube-api-access-6g96k\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.302232 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-579766b5b-pgs2q"] Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385208 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g96k\" (UniqueName: \"kubernetes.io/projected/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-kube-api-access-6g96k\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385301 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-public-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385327 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data-custom\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385440 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849rj\" (UniqueName: \"kubernetes.io/projected/d0e19e34-aa03-40bc-8f4b-3604a80d6683-kube-api-access-849rj\") pod 
\"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385472 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-combined-ca-bundle\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385509 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-combined-ca-bundle\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385528 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-internal-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385558 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-internal-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385715 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-public-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.385782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data-custom\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.410969 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " 
pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.411894 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data-custom\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.413296 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-config-data-custom\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.418469 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-public-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.419333 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-internal-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.420477 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-combined-ca-bundle\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.424957 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-public-tls-certs\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.424983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-internal-tls-certs\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.427224 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e19e34-aa03-40bc-8f4b-3604a80d6683-combined-ca-bundle\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.429864 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-config-data\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.430765 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6g96k\" (UniqueName: \"kubernetes.io/projected/58b7d970-aa37-44b3-b64b-a55bcf38f7cb-kube-api-access-6g96k\") pod \"heat-api-6885f6c5bd-nskzc\" (UID: \"58b7d970-aa37-44b3-b64b-a55bcf38f7cb\") " pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.435399 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.439496 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849rj\" (UniqueName: \"kubernetes.io/projected/d0e19e34-aa03-40bc-8f4b-3604a80d6683-kube-api-access-849rj\") pod \"heat-cfnapi-579766b5b-pgs2q\" (UID: \"d0e19e34-aa03-40bc-8f4b-3604a80d6683\") " pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:37 crc kubenswrapper[4762]: I0217 14:31:37.500858 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.264669 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.346100 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.346375 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="dnsmasq-dns" containerID="cri-o://93c59150e6f56455566c0992cf1e3e192dfdc61550db8c1d7bbc64ab523ef0db" gracePeriod=10 Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.347487 4762 generic.go:334] "Generic (PLEG): container finished" podID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerID="3edd36da835045c104685eeac4fa3aec31a9b1c68918bf13613ede68ee59feab" exitCode=0 Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.347518 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c546d4cd-5fhzh" event={"ID":"7f10d6b8-9fc3-478a-aee3-accc92b73dfa","Type":"ContainerDied","Data":"3edd36da835045c104685eeac4fa3aec31a9b1c68918bf13613ede68ee59feab"} Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.432797 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59c546d4cd-5fhzh" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.221:8004/healthcheck\": dial tcp 10.217.0.221:8004: connect: connection refused" Feb 17 14:31:38 crc kubenswrapper[4762]: I0217 14:31:38.620265 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.220:8000/healthcheck\": read tcp 10.217.0.2:45590->10.217.0.220:8000: read: connection reset by peer" Feb 17 14:31:39 crc kubenswrapper[4762]: I0217 14:31:39.368962 4762 generic.go:334] "Generic (PLEG): container finished" podID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerID="93c59150e6f56455566c0992cf1e3e192dfdc61550db8c1d7bbc64ab523ef0db" exitCode=0 Feb 17 14:31:39 crc kubenswrapper[4762]: I0217 14:31:39.369023 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" 
event={"ID":"65bff6fa-f7aa-4b40-ae05-169a575e6096","Type":"ContainerDied","Data":"93c59150e6f56455566c0992cf1e3e192dfdc61550db8c1d7bbc64ab523ef0db"} Feb 17 14:31:39 crc kubenswrapper[4762]: I0217 14:31:39.370871 4762 generic.go:334] "Generic (PLEG): container finished" podID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerID="6e559f62380a05ade8cc510cf20d2dca772deb9fd9d11188930dfb8296d82cce" exitCode=0 Feb 17 14:31:39 crc kubenswrapper[4762]: I0217 14:31:39.370920 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" event={"ID":"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4","Type":"ContainerDied","Data":"6e559f62380a05ade8cc510cf20d2dca772deb9fd9d11188930dfb8296d82cce"} Feb 17 14:31:40 crc kubenswrapper[4762]: I0217 14:31:40.078549 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:31:40 crc kubenswrapper[4762]: E0217 14:31:40.079079 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.227117 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.273529 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.276120 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.316562 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle\") pod \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.316932 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data\") pod \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.317088 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvb85\" (UniqueName: \"kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85\") pod \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.317371 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom\") pod \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\" (UID: \"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.354710 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" (UID: "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.368171 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85" (OuterVolumeSpecName: "kube-api-access-gvb85") pod "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" (UID: "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4"). InnerVolumeSpecName "kube-api-access-gvb85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.413195 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c9dd2323-04a9-409b-b035-7d086e4eaef6","Type":"ContainerStarted","Data":"d3d4610086a5d124547ae3745637aa35d9e991447b2278ab402db36930936099"} Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.416016 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c546d4cd-5fhzh" event={"ID":"7f10d6b8-9fc3-478a-aee3-accc92b73dfa","Type":"ContainerDied","Data":"a40c558d66dd8410d087050ee1bf53b604317a4487addbb3ce31b3f5f73239b3"} Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.416085 4762 scope.go:117] "RemoveContainer" containerID="3edd36da835045c104685eeac4fa3aec31a9b1c68918bf13613ede68ee59feab" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.416285 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-59c546d4cd-5fhzh" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.426181 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb\") pod \"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.426370 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lhx\" (UniqueName: \"kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx\") pod \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430068 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom\") pod \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430151 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cwdb\" (UniqueName: \"kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb\") pod \"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430306 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb\") pod \"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430358 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle\") pod \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430448 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc\") pod \"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430504 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data\") pod \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\" (UID: \"7f10d6b8-9fc3-478a-aee3-accc92b73dfa\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430531 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config\") pod \"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.430563 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0\") pod 
\"65bff6fa-f7aa-4b40-ae05-169a575e6096\" (UID: \"65bff6fa-f7aa-4b40-ae05-169a575e6096\") " Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.431695 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvb85\" (UniqueName: \"kubernetes.io/projected/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-kube-api-access-gvb85\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.431720 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.433680 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx" (OuterVolumeSpecName: "kube-api-access-k9lhx") pod "7f10d6b8-9fc3-478a-aee3-accc92b73dfa" (UID: "7f10d6b8-9fc3-478a-aee3-accc92b73dfa"). InnerVolumeSpecName "kube-api-access-k9lhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.435684 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" event={"ID":"3e9e6a53-bbe8-48f0-92a3-235040cfc7d4","Type":"ContainerDied","Data":"0fd3e5a55ce7feca0c415028a13cd5a3950d06d4749372fd20b45af8328994a6"} Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.435798 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fcd77bc97-54sbg" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.447887 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" event={"ID":"65bff6fa-f7aa-4b40-ae05-169a575e6096","Type":"ContainerDied","Data":"2db46896d334f0e74452a92b99c92527d0e4cc01e446e52a5f7078fda797892b"} Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.447957 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.448364 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f10d6b8-9fc3-478a-aee3-accc92b73dfa" (UID: "7f10d6b8-9fc3-478a-aee3-accc92b73dfa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.460319 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.608124478 podStartE2EDuration="26.460288471s" podCreationTimestamp="2026-02-17 14:31:16 +0000 UTC" firstStartedPulling="2026-02-17 14:31:18.000031696 +0000 UTC m=+1558.580032348" lastFinishedPulling="2026-02-17 14:31:41.852195689 +0000 UTC m=+1582.432196341" observedRunningTime="2026-02-17 14:31:42.434098142 +0000 UTC m=+1583.014098794" watchObservedRunningTime="2026-02-17 14:31:42.460288471 +0000 UTC m=+1583.040289113" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.472903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb" (OuterVolumeSpecName: "kube-api-access-2cwdb") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). 
InnerVolumeSpecName "kube-api-access-2cwdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.476426 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" (UID: "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.512001 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data" (OuterVolumeSpecName: "config-data") pod "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" (UID: "3e9e6a53-bbe8-48f0-92a3-235040cfc7d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.543452 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.543483 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lhx\" (UniqueName: \"kubernetes.io/projected/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-kube-api-access-k9lhx\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.543493 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.543505 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cwdb\" (UniqueName: \"kubernetes.io/projected/65bff6fa-f7aa-4b40-ae05-169a575e6096-kube-api-access-2cwdb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.543515 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.544045 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.563033 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data" (OuterVolumeSpecName: "config-data") pod "7f10d6b8-9fc3-478a-aee3-accc92b73dfa" (UID: "7f10d6b8-9fc3-478a-aee3-accc92b73dfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.591501 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.628774 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f10d6b8-9fc3-478a-aee3-accc92b73dfa" (UID: "7f10d6b8-9fc3-478a-aee3-accc92b73dfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.650337 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.650396 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.650411 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f10d6b8-9fc3-478a-aee3-accc92b73dfa-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.650420 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.654503 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config" (OuterVolumeSpecName: "config") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.661218 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.666236 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65bff6fa-f7aa-4b40-ae05-169a575e6096" (UID: "65bff6fa-f7aa-4b40-ae05-169a575e6096"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.680885 4762 scope.go:117] "RemoveContainer" containerID="6e559f62380a05ade8cc510cf20d2dca772deb9fd9d11188930dfb8296d82cce" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.733360 4762 scope.go:117] "RemoveContainer" containerID="93c59150e6f56455566c0992cf1e3e192dfdc61550db8c1d7bbc64ab523ef0db" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.759934 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.759974 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.759989 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65bff6fa-f7aa-4b40-ae05-169a575e6096-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.776088 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6545f49b85-762lt"] Feb 17 14:31:42 crc kubenswrapper[4762]: W0217 14:31:42.792136 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd19729e1_9b79_4762_821b_10ccba91c176.slice/crio-db1d6b894a3a7bfacbf03259a92233afe6e8e5e79eb83e6d6db6a164f6b13a2c WatchSource:0}: Error finding container db1d6b894a3a7bfacbf03259a92233afe6e8e5e79eb83e6d6db6a164f6b13a2c: Status 404 returned error can't find the container with id db1d6b894a3a7bfacbf03259a92233afe6e8e5e79eb83e6d6db6a164f6b13a2c Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.803989 4762 scope.go:117] "RemoveContainer" containerID="80f2662feae74d8b54a324a35f9f3dee6b653f1f6a0420e7070729dac06143a7" Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.825954 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68c7cc4b78-lr6mt"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.845042 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6885f6c5bd-nskzc"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.864355 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.877904 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.896810 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-59c546d4cd-5fhzh"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.926839 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.942924 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-zmxjz"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.958977 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"] Feb 17 14:31:42 crc kubenswrapper[4762]: I0217 14:31:42.977399 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6fcd77bc97-54sbg"] Feb 17 14:31:43 
crc kubenswrapper[4762]: I0217 14:31:43.004282 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-579766b5b-pgs2q"] Feb 17 14:31:43 crc kubenswrapper[4762]: W0217 14:31:43.043807 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e19e34_aa03_40bc_8f4b_3604a80d6683.slice/crio-f2885177097929aa7d6a962cbe8be3c93d9a1e6fc6ad2b2d7389b86a866c8b28 WatchSource:0}: Error finding container f2885177097929aa7d6a962cbe8be3c93d9a1e6fc6ad2b2d7389b86a866c8b28: Status 404 returned error can't find the container with id f2885177097929aa7d6a962cbe8be3c93d9a1e6fc6ad2b2d7389b86a866c8b28 Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.423211 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.423531 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-log" containerID="cri-o://edb0b37b8e520ee4aef70d35fcf290ea941c0e99ba43b8495f41be5f2c8163b6" gracePeriod=30 Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.423679 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-httpd" containerID="cri-o://269c14e2b5e7f2da1726887ab2d0730d9718b9f869f69708d78797d066565255" gracePeriod=30 Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.461583 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6885f6c5bd-nskzc" event={"ID":"58b7d970-aa37-44b3-b64b-a55bcf38f7cb","Type":"ContainerStarted","Data":"e4f3f7bd39c5ba4a9bfc752fbd33c4231ec8836b1c5dfcad52d6b6e8dae43b0f"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.464127 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68c7cc4b78-lr6mt" event={"ID":"d19729e1-9b79-4762-821b-10ccba91c176","Type":"ContainerStarted","Data":"5fb62751382f148ffba8e54ac58d13200c604dcb4f1ce52afbbb60aced91c2b9"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.464172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68c7cc4b78-lr6mt" event={"ID":"d19729e1-9b79-4762-821b-10ccba91c176","Type":"ContainerStarted","Data":"db1d6b894a3a7bfacbf03259a92233afe6e8e5e79eb83e6d6db6a164f6b13a2c"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.464315 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-68c7cc4b78-lr6mt" Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.471071 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-579766b5b-pgs2q" event={"ID":"d0e19e34-aa03-40bc-8f4b-3604a80d6683","Type":"ContainerStarted","Data":"f2885177097929aa7d6a962cbe8be3c93d9a1e6fc6ad2b2d7389b86a866c8b28"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.489147 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" event={"ID":"19953d0a-f2bb-4e7c-b5fc-44218a467dc9","Type":"ContainerStarted","Data":"a992d57ddd1f55ad229d97f1aae1c95c31f7850e69056aebe3c1ea53d0645cd6"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.491696 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-68c7cc4b78-lr6mt" podStartSLOduration=9.49167449 
podStartE2EDuration="9.49167449s" podCreationTimestamp="2026-02-17 14:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:43.482165743 +0000 UTC m=+1584.062166395" watchObservedRunningTime="2026-02-17 14:31:43.49167449 +0000 UTC m=+1584.071675142" Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.495940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6545f49b85-762lt" event={"ID":"abea76c2-c351-4c12-85c0-fb86db09cdd1","Type":"ContainerStarted","Data":"8d9ec03afab85f57d1cba14b17960dbcc3471ab2bd62ade1164c76f360add337"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.495997 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6545f49b85-762lt" event={"ID":"abea76c2-c351-4c12-85c0-fb86db09cdd1","Type":"ContainerStarted","Data":"cd044d0be7f349e3f9b44c9a5f711eb99d541fed131316eb937a343639bfc54d"} Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.496097 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:43 crc kubenswrapper[4762]: I0217 14:31:43.516533 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6545f49b85-762lt" podStartSLOduration=9.516515733 podStartE2EDuration="9.516515733s" podCreationTimestamp="2026-02-17 14:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:43.513732418 +0000 UTC m=+1584.093733070" watchObservedRunningTime="2026-02-17 14:31:43.516515733 +0000 UTC m=+1584.096516385" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.084276 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" path="/var/lib/kubelet/pods/3e9e6a53-bbe8-48f0-92a3-235040cfc7d4/volumes" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.084866 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" path="/var/lib/kubelet/pods/65bff6fa-f7aa-4b40-ae05-169a575e6096/volumes" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.085488 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" path="/var/lib/kubelet/pods/7f10d6b8-9fc3-478a-aee3-accc92b73dfa/volumes" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.514899 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6885f6c5bd-nskzc" event={"ID":"58b7d970-aa37-44b3-b64b-a55bcf38f7cb","Type":"ContainerStarted","Data":"55964415a2b8cf01066e840d09ba4c339f41472ce9cdee932da68bc0d49e266f"} Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.515017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.523456 4762 generic.go:334] "Generic (PLEG): container finished" podID="85f7c024-456d-460f-b09f-77b5e8e10498" containerID="edb0b37b8e520ee4aef70d35fcf290ea941c0e99ba43b8495f41be5f2c8163b6" exitCode=143 Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.523531 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerDied","Data":"edb0b37b8e520ee4aef70d35fcf290ea941c0e99ba43b8495f41be5f2c8163b6"} Feb 17 14:31:44 crc 
kubenswrapper[4762]: I0217 14:31:44.526189 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-579766b5b-pgs2q" event={"ID":"d0e19e34-aa03-40bc-8f4b-3604a80d6683","Type":"ContainerStarted","Data":"2c2c93700da8668a1a5739bcfcadded62aed9dd05024691544bb07e6dfb51449"} Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.527434 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.531102 4762 generic.go:334] "Generic (PLEG): container finished" podID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerID="94aafde610b5e6ef47e8eca421c3236e26e373577ac1447c611c9e74a2b5aa5e" exitCode=1 Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.531182 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" event={"ID":"19953d0a-f2bb-4e7c-b5fc-44218a467dc9","Type":"ContainerDied","Data":"94aafde610b5e6ef47e8eca421c3236e26e373577ac1447c611c9e74a2b5aa5e"} Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.532063 4762 scope.go:117] "RemoveContainer" containerID="94aafde610b5e6ef47e8eca421c3236e26e373577ac1447c611c9e74a2b5aa5e" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.535926 4762 generic.go:334] "Generic (PLEG): container finished" podID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerID="8d9ec03afab85f57d1cba14b17960dbcc3471ab2bd62ade1164c76f360add337" exitCode=1 Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.536839 4762 scope.go:117] "RemoveContainer" containerID="8d9ec03afab85f57d1cba14b17960dbcc3471ab2bd62ade1164c76f360add337" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.537121 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6545f49b85-762lt" event={"ID":"abea76c2-c351-4c12-85c0-fb86db09cdd1","Type":"ContainerDied","Data":"8d9ec03afab85f57d1cba14b17960dbcc3471ab2bd62ade1164c76f360add337"} Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.548539 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6885f6c5bd-nskzc" podStartSLOduration=7.548516849 podStartE2EDuration="7.548516849s" podCreationTimestamp="2026-02-17 14:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:44.539634519 +0000 UTC m=+1585.119635171" watchObservedRunningTime="2026-02-17 14:31:44.548516849 +0000 UTC m=+1585.128517501" Feb 17 14:31:44 crc kubenswrapper[4762]: I0217 14:31:44.575430 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-579766b5b-pgs2q" podStartSLOduration=7.575400078 podStartE2EDuration="7.575400078s" podCreationTimestamp="2026-02-17 14:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:44.562040246 +0000 UTC m=+1585.142040888" watchObservedRunningTime="2026-02-17 14:31:44.575400078 +0000 UTC m=+1585.155400730" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.337137 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.337476 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.354390 4762 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.549622 4762 generic.go:334] "Generic (PLEG): container finished" podID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerID="dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd" exitCode=1 Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.549707 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6545f49b85-762lt" event={"ID":"abea76c2-c351-4c12-85c0-fb86db09cdd1","Type":"ContainerDied","Data":"dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd"} Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.549742 4762 scope.go:117] "RemoveContainer" containerID="8d9ec03afab85f57d1cba14b17960dbcc3471ab2bd62ade1164c76f360add337" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.550582 4762 scope.go:117] "RemoveContainer" containerID="dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd" Feb 17 14:31:45 crc kubenswrapper[4762]: E0217 14:31:45.550919 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6545f49b85-762lt_openstack(abea76c2-c351-4c12-85c0-fb86db09cdd1)\"" pod="openstack/heat-api-6545f49b85-762lt" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.556402 4762 generic.go:334] "Generic (PLEG): container finished" podID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" exitCode=1 Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.558074 4762 scope.go:117] "RemoveContainer" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" Feb 17 14:31:45 crc kubenswrapper[4762]: E0217 14:31:45.558365 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-68d86764f7-2hn2f_openstack(19953d0a-f2bb-4e7c-b5fc-44218a467dc9)\"" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.558406 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" event={"ID":"19953d0a-f2bb-4e7c-b5fc-44218a467dc9","Type":"ContainerDied","Data":"1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667"} Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.634211 4762 scope.go:117] "RemoveContainer" containerID="94aafde610b5e6ef47e8eca421c3236e26e373577ac1447c611c9e74a2b5aa5e" Feb 17 14:31:45 crc kubenswrapper[4762]: I0217 14:31:45.787020 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5x5bg" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" probeResult="failure" output=< Feb 17 14:31:45 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:31:45 crc kubenswrapper[4762]: > Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.181500 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jljhd"] Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.182585 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerName="heat-api" Feb 
17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182603 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerName="heat-api" Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.182620 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerName="heat-cfnapi" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182625 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerName="heat-cfnapi" Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.182653 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="init" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182659 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="init" Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.182703 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="dnsmasq-dns" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182709 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="dnsmasq-dns" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182915 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="dnsmasq-dns" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182936 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9e6a53-bbe8-48f0-92a3-235040cfc7d4" containerName="heat-cfnapi" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.182949 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f10d6b8-9fc3-478a-aee3-accc92b73dfa" containerName="heat-api" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.183783 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.212815 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jljhd"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.250990 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9rqk\" (UniqueName: \"kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.251217 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.267736 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nnss4"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.269418 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.284121 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0142-account-create-update-9mv69"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.286369 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.292018 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.299517 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nnss4"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.311168 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0142-account-create-update-9mv69"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.353698 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9rqk\" (UniqueName: \"kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.353812 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8njdt\" (UniqueName: \"kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.353905 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq7hb\" (UniqueName: \"kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.353933 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.353959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.354216 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.355468 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.373913 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9rqk\" (UniqueName: \"kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk\") pod \"nova-api-db-create-jljhd\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.456634 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kz5nv"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.456712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.456943 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8njdt\" (UniqueName: \"kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.457038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq7hb\" (UniqueName: \"kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.457064 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.457900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.458440 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.460218 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.477116 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8njdt\" (UniqueName: \"kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt\") pod \"nova-api-0142-account-create-update-9mv69\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.479756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kz5nv"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.492328 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq7hb\" (UniqueName: \"kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb\") pod \"nova-cell0-db-create-nnss4\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.498615 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8886-account-create-update-w9f55"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.500449 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.504918 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.506799 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.566331 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7242r\" (UniqueName: \"kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.566450 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.566591 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.566632 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qftcq\" (UniqueName: \"kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.581391 4762 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8886-account-create-update-w9f55"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.602005 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.621262 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.640992 4762 scope.go:117] "RemoveContainer" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.641313 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-68d86764f7-2hn2f_openstack(19953d0a-f2bb-4e7c-b5fc-44218a467dc9)\"" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.663717 4762 scope.go:117] "RemoveContainer" containerID="dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd" Feb 17 14:31:46 crc kubenswrapper[4762]: E0217 14:31:46.669161 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6545f49b85-762lt_openstack(abea76c2-c351-4c12-85c0-fb86db09cdd1)\"" pod="openstack/heat-api-6545f49b85-762lt" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.683685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7242r\" (UniqueName: \"kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.684081 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.684373 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.684420 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qftcq\" (UniqueName: \"kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.686143 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.686527 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.744822 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9c9e-account-create-update-2865f"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.745309 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qftcq\" (UniqueName: \"kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq\") pod \"nova-cell0-8886-account-create-update-w9f55\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.746762 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.749240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7242r\" (UniqueName: \"kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r\") pod \"nova-cell1-db-create-kz5nv\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.756741 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.783742 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9c9e-account-create-update-2865f"] Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.836988 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.886626 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.889865 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmk2z\" (UniqueName: \"kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.889964 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.994749 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmk2z\" (UniqueName: \"kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:46 crc kubenswrapper[4762]: I0217 14:31:46.994900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:46.997673 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.047304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmk2z\" (UniqueName: \"kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z\") pod \"nova-cell1-9c9e-account-create-update-2865f\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.106782 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-zmxjz" podUID="65bff6fa-f7aa-4b40-ae05-169a575e6096" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.209:5353: i/o timeout" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.152692 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.328268 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jljhd"] Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.576925 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0142-account-create-update-9mv69"] Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.679165 4762 generic.go:334] "Generic (PLEG): container finished" podID="85f7c024-456d-460f-b09f-77b5e8e10498" containerID="269c14e2b5e7f2da1726887ab2d0730d9718b9f869f69708d78797d066565255" exitCode=0 Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.679258 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerDied","Data":"269c14e2b5e7f2da1726887ab2d0730d9718b9f869f69708d78797d066565255"} Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.693475 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jljhd" event={"ID":"bb8711f3-a902-4c23-8c91-3e8819cc74ca","Type":"ContainerStarted","Data":"33f97202480ecfda56e480dc6249c5de214583f94de8cfdbe0667c9701d847ce"} Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.693528 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jljhd" event={"ID":"bb8711f3-a902-4c23-8c91-3e8819cc74ca","Type":"ContainerStarted","Data":"4fc2611331a89b0e03cca3d9dfd19975d5bb01dd34d9d25087839a7ff13a5574"} Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.709052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0142-account-create-update-9mv69" event={"ID":"277ee237-c640-42ab-8439-d23e72f087e1","Type":"ContainerStarted","Data":"9fd0786d903842cc5519a80589fc58d325593c23574a84d48404795618d93194"} Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.709554 4762 scope.go:117] "RemoveContainer" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" Feb 17 14:31:47 crc kubenswrapper[4762]: E0217 14:31:47.709974 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-68d86764f7-2hn2f_openstack(19953d0a-f2bb-4e7c-b5fc-44218a467dc9)\"" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.734812 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-jljhd" podStartSLOduration=1.7347911919999999 podStartE2EDuration="1.734791192s" podCreationTimestamp="2026-02-17 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:47.725022068 +0000 UTC m=+1588.305022730" watchObservedRunningTime="2026-02-17 14:31:47.734791192 +0000 UTC m=+1588.314791844" Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.758701 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8886-account-create-update-w9f55"] Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.808756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nnss4"] Feb 17 14:31:47 crc kubenswrapper[4762]: I0217 14:31:47.982388 4762 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kz5nv"] Feb 17 14:31:48 crc kubenswrapper[4762]: W0217 14:31:48.000238 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6bb5440_4045_43cc_acbd_a61bc6b8efa7.slice/crio-6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918 WatchSource:0}: Error finding container 6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918: Status 404 returned error can't find the container with id 6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918 Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.138084 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.145102 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9c9e-account-create-update-2865f"] Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.189841 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231064 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rctfg\" (UniqueName: \"kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231133 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231220 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231413 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs\") pod 
\"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.231511 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data\") pod \"85f7c024-456d-460f-b09f-77b5e8e10498\" (UID: \"85f7c024-456d-460f-b09f-77b5e8e10498\") " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.232040 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.246334 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs" (OuterVolumeSpecName: "logs") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.393962 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.393998 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85f7c024-456d-460f-b09f-77b5e8e10498-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.542198 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts" (OuterVolumeSpecName: "scripts") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.542882 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg" (OuterVolumeSpecName: "kube-api-access-rctfg") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "kube-api-access-rctfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.611533 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rctfg\" (UniqueName: \"kubernetes.io/projected/85f7c024-456d-460f-b09f-77b5e8e10498-kube-api-access-rctfg\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.611572 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.629191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f" (OuterVolumeSpecName: "glance") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). 
InnerVolumeSpecName "pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.713519 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") on node \"crc\" " Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.738607 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8886-account-create-update-w9f55" event={"ID":"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8","Type":"ContainerStarted","Data":"73297b536a093f8cfe7bdaf06c10d9fb0994bd62ea41652f37bfcbab4296d283"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.738687 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8886-account-create-update-w9f55" event={"ID":"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8","Type":"ContainerStarted","Data":"8d408c1a5700ab0565d29e367433b46ab77f60e1189eab71751a00f211863984"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.745071 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"85f7c024-456d-460f-b09f-77b5e8e10498","Type":"ContainerDied","Data":"c52e7a3c95daf9c0b479235656d1ffc6ff961388e379530da8215e363c02e4db"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.745121 4762 scope.go:117] "RemoveContainer" containerID="269c14e2b5e7f2da1726887ab2d0730d9718b9f869f69708d78797d066565255" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.745262 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.748200 4762 generic.go:334] "Generic (PLEG): container finished" podID="bb8711f3-a902-4c23-8c91-3e8819cc74ca" containerID="33f97202480ecfda56e480dc6249c5de214583f94de8cfdbe0667c9701d847ce" exitCode=0 Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.748359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jljhd" event={"ID":"bb8711f3-a902-4c23-8c91-3e8819cc74ca","Type":"ContainerDied","Data":"33f97202480ecfda56e480dc6249c5de214583f94de8cfdbe0667c9701d847ce"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.753768 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0142-account-create-update-9mv69" event={"ID":"277ee237-c640-42ab-8439-d23e72f087e1","Type":"ContainerStarted","Data":"c11054ab4bee3fbdac5eb4396c9b77028cc1f98238cda254ac44fa4f621f54e6"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.767313 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nnss4" event={"ID":"da99eccd-0482-4e64-bb27-6b87437ae8ba","Type":"ContainerStarted","Data":"496cc57796dd27fdb322dce4f895bd33a74f61948764b2bbf10850f997eeef14"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.768437 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nnss4" event={"ID":"da99eccd-0482-4e64-bb27-6b87437ae8ba","Type":"ContainerStarted","Data":"2e16de10dc0ff58d9cb4c93b3c9c26982e9fb9a0efd31f9cb69e3358bb4d14db"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.772233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" 
event={"ID":"d5fb9f5e-d096-4b3d-82cb-881bcc844cab","Type":"ContainerStarted","Data":"ae0892c4709b090d30a835e244e6c586f89d967c853512219593fc836202cae5"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.774715 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz5nv" event={"ID":"b6bb5440-4045-43cc-acbd-a61bc6b8efa7","Type":"ContainerStarted","Data":"6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918"} Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.775497 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8886-account-create-update-w9f55" podStartSLOduration=2.775480994 podStartE2EDuration="2.775480994s" podCreationTimestamp="2026-02-17 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:48.758903195 +0000 UTC m=+1589.338903847" watchObservedRunningTime="2026-02-17 14:31:48.775480994 +0000 UTC m=+1589.355481666" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.783661 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.789979 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0142-account-create-update-9mv69" podStartSLOduration=2.789959896 podStartE2EDuration="2.789959896s" podCreationTimestamp="2026-02-17 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:48.787332235 +0000 UTC m=+1589.367332907" watchObservedRunningTime="2026-02-17 14:31:48.789959896 +0000 UTC m=+1589.369960538" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.791333 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.791486 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f") on node "crc" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.806229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data" (OuterVolumeSpecName: "config-data") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.824430 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.830195 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.830212 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.830101 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "85f7c024-456d-460f-b09f-77b5e8e10498" (UID: "85f7c024-456d-460f-b09f-77b5e8e10498"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.839297 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-nnss4" podStartSLOduration=2.839278872 podStartE2EDuration="2.839278872s" podCreationTimestamp="2026-02-17 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:48.820696899 +0000 UTC m=+1589.400697551" watchObservedRunningTime="2026-02-17 14:31:48.839278872 +0000 UTC m=+1589.419279524" Feb 17 14:31:48 crc kubenswrapper[4762]: I0217 14:31:48.933030 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f7c024-456d-460f-b09f-77b5e8e10498-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.176614 4762 scope.go:117] "RemoveContainer" containerID="edb0b37b8e520ee4aef70d35fcf290ea941c0e99ba43b8495f41be5f2c8163b6" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.234669 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.251394 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.266088 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:49 crc kubenswrapper[4762]: E0217 14:31:49.266747 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-log" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.266765 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-log" Feb 17 14:31:49 crc kubenswrapper[4762]: E0217 14:31:49.266787 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-httpd" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.266794 4762 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-httpd" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.267024 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-httpd" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.267058 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" containerName="glance-log" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.268421 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.272721 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.273802 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.304915 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.345793 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.346543 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-log" containerID="cri-o://80f9aa22b822f0b15afdc8fa63b813a132cb5897e20b1c25212e7e3ca7e5cd55" gracePeriod=30 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.347289 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-httpd" containerID="cri-o://0b62a9d98e888b0e0dc59d942af63064b26f4e10cb512add83ab42d2ca101810" gracePeriod=30 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457520 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457713 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457785 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.457877 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8p7z\" (UniqueName: \"kubernetes.io/projected/c92f5203-d922-420b-9537-34cb7656e78c-kube-api-access-b8p7z\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.458235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.458581 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561240 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561308 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561340 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561390 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561415 4762 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8p7z\" (UniqueName: \"kubernetes.io/projected/c92f5203-d922-420b-9537-34cb7656e78c-kube-api-access-b8p7z\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561510 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.561834 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.562369 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92f5203-d922-420b-9537-34cb7656e78c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.571280 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.573715 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.573766 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c94ac0752a1dcb91ec40ba4c560720e8a8734d2d1a06b78b6730ccf35fc18fc/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.585578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8p7z\" (UniqueName: \"kubernetes.io/projected/c92f5203-d922-420b-9537-34cb7656e78c-kube-api-access-b8p7z\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.585674 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.585782 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.589768 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92f5203-d922-420b-9537-34cb7656e78c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.652237 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85cee8bc-d5fd-4c7a-bc39-77678be6fa8f\") pod \"glance-default-internal-api-0\" (UID: \"c92f5203-d922-420b-9537-34cb7656e78c\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.797819 4762 generic.go:334] "Generic (PLEG): container finished" podID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerID="80f9aa22b822f0b15afdc8fa63b813a132cb5897e20b1c25212e7e3ca7e5cd55" exitCode=143 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.797967 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerDied","Data":"80f9aa22b822f0b15afdc8fa63b813a132cb5897e20b1c25212e7e3ca7e5cd55"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.801158 4762 generic.go:334] "Generic (PLEG): container finished" podID="277ee237-c640-42ab-8439-d23e72f087e1" containerID="c11054ab4bee3fbdac5eb4396c9b77028cc1f98238cda254ac44fa4f621f54e6" exitCode=0 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.801269 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0142-account-create-update-9mv69" event={"ID":"277ee237-c640-42ab-8439-d23e72f087e1","Type":"ContainerDied","Data":"c11054ab4bee3fbdac5eb4396c9b77028cc1f98238cda254ac44fa4f621f54e6"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.803334 4762 generic.go:334] "Generic (PLEG): container finished" podID="da99eccd-0482-4e64-bb27-6b87437ae8ba" containerID="496cc57796dd27fdb322dce4f895bd33a74f61948764b2bbf10850f997eeef14" exitCode=0 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.803417 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nnss4" event={"ID":"da99eccd-0482-4e64-bb27-6b87437ae8ba","Type":"ContainerDied","Data":"496cc57796dd27fdb322dce4f895bd33a74f61948764b2bbf10850f997eeef14"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.805577 4762 generic.go:334] "Generic (PLEG): container finished" podID="d5fb9f5e-d096-4b3d-82cb-881bcc844cab" containerID="fcaecfe9e3ce19cb2373ae5e2053e815efa636d9678d4dffc4d12d0db7ebc9dd" exitCode=0 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.805694 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" event={"ID":"d5fb9f5e-d096-4b3d-82cb-881bcc844cab","Type":"ContainerDied","Data":"fcaecfe9e3ce19cb2373ae5e2053e815efa636d9678d4dffc4d12d0db7ebc9dd"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.808442 4762 generic.go:334] "Generic (PLEG): container finished" podID="b6bb5440-4045-43cc-acbd-a61bc6b8efa7" containerID="8281960df4711a0ed57712cf1c3d31c153c2d3903dbfc30b5ee22eae721aeb48" exitCode=0 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.808521 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz5nv" event={"ID":"b6bb5440-4045-43cc-acbd-a61bc6b8efa7","Type":"ContainerDied","Data":"8281960df4711a0ed57712cf1c3d31c153c2d3903dbfc30b5ee22eae721aeb48"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.810791 4762 generic.go:334] "Generic (PLEG): container finished" podID="8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" containerID="73297b536a093f8cfe7bdaf06c10d9fb0994bd62ea41652f37bfcbab4296d283" exitCode=0 Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.810836 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8886-account-create-update-w9f55" event={"ID":"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8","Type":"ContainerDied","Data":"73297b536a093f8cfe7bdaf06c10d9fb0994bd62ea41652f37bfcbab4296d283"} Feb 17 14:31:49 crc kubenswrapper[4762]: I0217 14:31:49.942759 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.093366 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f7c024-456d-460f-b09f-77b5e8e10498" path="/var/lib/kubelet/pods/85f7c024-456d-460f-b09f-77b5e8e10498/volumes" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.336908 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.337230 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.338166 4762 scope.go:117] "RemoveContainer" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" Feb 17 14:31:50 crc kubenswrapper[4762]: E0217 14:31:50.338560 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-68d86764f7-2hn2f_openstack(19953d0a-f2bb-4e7c-b5fc-44218a467dc9)\"" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.354507 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.354554 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.355815 4762 scope.go:117] "RemoveContainer" containerID="dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd" Feb 17 14:31:50 crc kubenswrapper[4762]: E0217 14:31:50.356138 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6545f49b85-762lt_openstack(abea76c2-c351-4c12-85c0-fb86db09cdd1)\"" pod="openstack/heat-api-6545f49b85-762lt" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.398904 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.497707 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9rqk\" (UniqueName: \"kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk\") pod \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.497944 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts\") pod \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\" (UID: \"bb8711f3-a902-4c23-8c91-3e8819cc74ca\") " Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.498610 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb8711f3-a902-4c23-8c91-3e8819cc74ca" (UID: "bb8711f3-a902-4c23-8c91-3e8819cc74ca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.503852 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk" (OuterVolumeSpecName: "kube-api-access-f9rqk") pod "bb8711f3-a902-4c23-8c91-3e8819cc74ca" (UID: "bb8711f3-a902-4c23-8c91-3e8819cc74ca"). InnerVolumeSpecName "kube-api-access-f9rqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.602654 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb8711f3-a902-4c23-8c91-3e8819cc74ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.602685 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9rqk\" (UniqueName: \"kubernetes.io/projected/bb8711f3-a902-4c23-8c91-3e8819cc74ca-kube-api-access-f9rqk\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.610782 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.831061 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c92f5203-d922-420b-9537-34cb7656e78c","Type":"ContainerStarted","Data":"b6f2f0649cbaadb1f745d64923fe567401e52c8978c85aac37467ae7a4c4a0f8"} Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.834696 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jljhd" event={"ID":"bb8711f3-a902-4c23-8c91-3e8819cc74ca","Type":"ContainerDied","Data":"4fc2611331a89b0e03cca3d9dfd19975d5bb01dd34d9d25087839a7ff13a5574"} Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.834730 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jljhd" Feb 17 14:31:50 crc kubenswrapper[4762]: I0217 14:31:50.834751 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc2611331a89b0e03cca3d9dfd19975d5bb01dd34d9d25087839a7ff13a5574" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.562264 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.733340 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts\") pod \"277ee237-c640-42ab-8439-d23e72f087e1\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.734087 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "277ee237-c640-42ab-8439-d23e72f087e1" (UID: "277ee237-c640-42ab-8439-d23e72f087e1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.734499 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8njdt\" (UniqueName: \"kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt\") pod \"277ee237-c640-42ab-8439-d23e72f087e1\" (UID: \"277ee237-c640-42ab-8439-d23e72f087e1\") " Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.735730 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/277ee237-c640-42ab-8439-d23e72f087e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.740471 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt" (OuterVolumeSpecName: "kube-api-access-8njdt") pod "277ee237-c640-42ab-8439-d23e72f087e1" (UID: "277ee237-c640-42ab-8439-d23e72f087e1"). InnerVolumeSpecName "kube-api-access-8njdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.837520 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8njdt\" (UniqueName: \"kubernetes.io/projected/277ee237-c640-42ab-8439-d23e72f087e1-kube-api-access-8njdt\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.865549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" event={"ID":"d5fb9f5e-d096-4b3d-82cb-881bcc844cab","Type":"ContainerDied","Data":"ae0892c4709b090d30a835e244e6c586f89d967c853512219593fc836202cae5"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.865595 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0892c4709b090d30a835e244e6c586f89d967c853512219593fc836202cae5" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.871771 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c92f5203-d922-420b-9537-34cb7656e78c","Type":"ContainerStarted","Data":"31710c7d7ca0ce006251aa79aef0154678c4106336128ba034b57430e3bfeb0b"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.874283 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz5nv" event={"ID":"b6bb5440-4045-43cc-acbd-a61bc6b8efa7","Type":"ContainerDied","Data":"6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.874319 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8574c2d6307e3bf9f31d6c0b67812594c5f1e748cd0ca392a0213de51af918" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.880597 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8886-account-create-update-w9f55" event={"ID":"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8","Type":"ContainerDied","Data":"8d408c1a5700ab0565d29e367433b46ab77f60e1189eab71751a00f211863984"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.880665 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d408c1a5700ab0565d29e367433b46ab77f60e1189eab71751a00f211863984" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.882013 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0142-account-create-update-9mv69" 
event={"ID":"277ee237-c640-42ab-8439-d23e72f087e1","Type":"ContainerDied","Data":"9fd0786d903842cc5519a80589fc58d325593c23574a84d48404795618d93194"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.882040 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd0786d903842cc5519a80589fc58d325593c23574a84d48404795618d93194" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.882108 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0142-account-create-update-9mv69" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.887213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nnss4" event={"ID":"da99eccd-0482-4e64-bb27-6b87437ae8ba","Type":"ContainerDied","Data":"2e16de10dc0ff58d9cb4c93b3c9c26982e9fb9a0efd31f9cb69e3358bb4d14db"} Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.887284 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e16de10dc0ff58d9cb4c93b3c9c26982e9fb9a0efd31f9cb69e3358bb4d14db" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.915441 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.964031 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.978983 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:51 crc kubenswrapper[4762]: I0217 14:31:51.992923 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.044691 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts\") pod \"da99eccd-0482-4e64-bb27-6b87437ae8ba\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.044839 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq7hb\" (UniqueName: \"kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb\") pod \"da99eccd-0482-4e64-bb27-6b87437ae8ba\" (UID: \"da99eccd-0482-4e64-bb27-6b87437ae8ba\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.045975 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da99eccd-0482-4e64-bb27-6b87437ae8ba" (UID: "da99eccd-0482-4e64-bb27-6b87437ae8ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.046575 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da99eccd-0482-4e64-bb27-6b87437ae8ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.057982 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb" (OuterVolumeSpecName: "kube-api-access-wq7hb") pod "da99eccd-0482-4e64-bb27-6b87437ae8ba" (UID: "da99eccd-0482-4e64-bb27-6b87437ae8ba"). InnerVolumeSpecName "kube-api-access-wq7hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.070994 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:31:52 crc kubenswrapper[4762]: E0217 14:31:52.071454 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts\") pod \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149087 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts\") pod \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149183 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts\") pod \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149218 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7242r\" (UniqueName: \"kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r\") pod \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\" (UID: \"b6bb5440-4045-43cc-acbd-a61bc6b8efa7\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149280 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qftcq\" (UniqueName: \"kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq\") pod \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\" (UID: \"8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149318 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmk2z\" (UniqueName: \"kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z\") pod 
\"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\" (UID: \"d5fb9f5e-d096-4b3d-82cb-881bcc844cab\") " Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149547 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" (UID: "8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.149749 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6bb5440-4045-43cc-acbd-a61bc6b8efa7" (UID: "b6bb5440-4045-43cc-acbd-a61bc6b8efa7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.150319 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5fb9f5e-d096-4b3d-82cb-881bcc844cab" (UID: "d5fb9f5e-d096-4b3d-82cb-881bcc844cab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.150498 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.150522 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq7hb\" (UniqueName: \"kubernetes.io/projected/da99eccd-0482-4e64-bb27-6b87437ae8ba-kube-api-access-wq7hb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.150535 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.154886 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq" (OuterVolumeSpecName: "kube-api-access-qftcq") pod "8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" (UID: "8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8"). InnerVolumeSpecName "kube-api-access-qftcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.154900 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r" (OuterVolumeSpecName: "kube-api-access-7242r") pod "b6bb5440-4045-43cc-acbd-a61bc6b8efa7" (UID: "b6bb5440-4045-43cc-acbd-a61bc6b8efa7"). InnerVolumeSpecName "kube-api-access-7242r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.155919 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z" (OuterVolumeSpecName: "kube-api-access-kmk2z") pod "d5fb9f5e-d096-4b3d-82cb-881bcc844cab" (UID: "d5fb9f5e-d096-4b3d-82cb-881bcc844cab"). 
InnerVolumeSpecName "kube-api-access-kmk2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.253241 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7242r\" (UniqueName: \"kubernetes.io/projected/b6bb5440-4045-43cc-acbd-a61bc6b8efa7-kube-api-access-7242r\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.253536 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qftcq\" (UniqueName: \"kubernetes.io/projected/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8-kube-api-access-qftcq\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.253552 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmk2z\" (UniqueName: \"kubernetes.io/projected/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-kube-api-access-kmk2z\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.253565 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5fb9f5e-d096-4b3d-82cb-881bcc844cab-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.928366 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c92f5203-d922-420b-9537-34cb7656e78c","Type":"ContainerStarted","Data":"7825c2747790a3dfc5d146a9b6d2ab68366be4e866d11498bd95675743445fb7"} Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.941214 4762 generic.go:334] "Generic (PLEG): container finished" podID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerID="0b62a9d98e888b0e0dc59d942af63064b26f4e10cb512add83ab42d2ca101810" exitCode=0 Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.941591 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9c9e-account-create-update-2865f" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.942770 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8886-account-create-update-w9f55" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.942949 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nnss4" Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.943782 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerDied","Data":"0b62a9d98e888b0e0dc59d942af63064b26f4e10cb512add83ab42d2ca101810"} Feb 17 14:31:52 crc kubenswrapper[4762]: I0217 14:31:52.943943 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kz5nv" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.006576 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.005736928 podStartE2EDuration="4.005736928s" podCreationTimestamp="2026-02-17 14:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:52.960604405 +0000 UTC m=+1593.540605057" watchObservedRunningTime="2026-02-17 14:31:53.005736928 +0000 UTC m=+1593.585737580" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.375894 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.376881 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fb9f5e-d096-4b3d-82cb-881bcc844cab" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.376905 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fb9f5e-d096-4b3d-82cb-881bcc844cab" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.376932 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.376940 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.376979 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bb5440-4045-43cc-acbd-a61bc6b8efa7" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.376988 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bb5440-4045-43cc-acbd-a61bc6b8efa7" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.377009 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da99eccd-0482-4e64-bb27-6b87437ae8ba" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.377016 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99eccd-0482-4e64-bb27-6b87437ae8ba" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.377033 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277ee237-c640-42ab-8439-d23e72f087e1" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.377042 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="277ee237-c640-42ab-8439-d23e72f087e1" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: E0217 14:31:53.377055 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8711f3-a902-4c23-8c91-3e8819cc74ca" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.377063 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8711f3-a902-4c23-8c91-3e8819cc74ca" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.382472 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="da99eccd-0482-4e64-bb27-6b87437ae8ba" containerName="mariadb-database-create" Feb 17 14:31:53 crc 
kubenswrapper[4762]: I0217 14:31:53.382536 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bb5440-4045-43cc-acbd-a61bc6b8efa7" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.382589 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="277ee237-c640-42ab-8439-d23e72f087e1" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.382605 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.382622 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5fb9f5e-d096-4b3d-82cb-881bcc844cab" containerName="mariadb-account-create-update" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.382631 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8711f3-a902-4c23-8c91-3e8819cc74ca" containerName="mariadb-database-create" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.385999 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.414480 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.485015 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpctl\" (UniqueName: \"kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.485123 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.485573 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.588170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.588456 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.588752 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rpctl\" (UniqueName: \"kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.588917 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.589573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.609898 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpctl\" (UniqueName: \"kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl\") pod \"certified-operators-bzsgc\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.726751 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.730920 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.899288 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2fq\" (UniqueName: \"kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.899560 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.899683 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.899736 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.900144 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.900293 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs" (OuterVolumeSpecName: "logs") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.900390 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.901834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.901969 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.902014 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data\") pod \"2a357fec-26ca-4478-8ec4-34b141dbe886\" (UID: \"2a357fec-26ca-4478-8ec4-34b141dbe886\") " Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.903110 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.903130 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a357fec-26ca-4478-8ec4-34b141dbe886-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.935345 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq" (OuterVolumeSpecName: "kube-api-access-mt2fq") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "kube-api-access-mt2fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.935459 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts" (OuterVolumeSpecName: "scripts") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.987776 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e" (OuterVolumeSpecName: "glance") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "pvc-2f5442b2-466c-497d-97f0-c22697b04d0e". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4762]: I0217 14:31:53.988194 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.010091 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2fq\" (UniqueName: \"kubernetes.io/projected/2a357fec-26ca-4478-8ec4-34b141dbe886-kube-api-access-mt2fq\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.010122 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.010190 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") on node \"crc\" " Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.010207 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.035931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2a357fec-26ca-4478-8ec4-34b141dbe886","Type":"ContainerDied","Data":"5882e5f11108e7bb28b49f159bd3440debfcda55922e2e6d17e0c46a9c28451e"} Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.036008 4762 scope.go:117] "RemoveContainer" containerID="0b62a9d98e888b0e0dc59d942af63064b26f4e10cb512add83ab42d2ca101810" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.036157 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.060834 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data" (OuterVolumeSpecName: "config-data") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.083426 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.083604 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2f5442b2-466c-497d-97f0-c22697b04d0e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e") on node "crc" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.096483 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2a357fec-26ca-4478-8ec4-34b141dbe886" (UID: "2a357fec-26ca-4478-8ec4-34b141dbe886"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.120049 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.120087 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a357fec-26ca-4478-8ec4-34b141dbe886-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.120102 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.204334 4762 scope.go:117] "RemoveContainer" containerID="80f9aa22b822f0b15afdc8fa63b813a132cb5897e20b1c25212e7e3ca7e5cd55" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.364341 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.375161 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.394033 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:54 crc kubenswrapper[4762]: E0217 14:31:54.394725 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-log" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.394745 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-log" Feb 17 14:31:54 crc kubenswrapper[4762]: E0217 14:31:54.394773 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-httpd" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.394779 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-httpd" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.395057 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-log" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.395081 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" containerName="glance-httpd" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.396432 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.400323 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.400393 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.408345 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.486772 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537477 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwk9l\" (UniqueName: \"kubernetes.io/projected/d64001d1-6972-4563-a764-05b359233d62-kube-api-access-gwk9l\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537739 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537820 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537857 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537913 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod 
\"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.537937 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-logs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.616322 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6885f6c5bd-nskzc" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.639767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.639808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.639884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.639931 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.639997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.640013 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-logs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.640070 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwk9l\" (UniqueName: \"kubernetes.io/projected/d64001d1-6972-4563-a764-05b359233d62-kube-api-access-gwk9l\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.640136 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.640642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.641187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64001d1-6972-4563-a764-05b359233d62-logs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.649624 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.650115 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.650143 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd98bc01ad401fb0843a9dd71ca408e41c0fbbffed1920afb8717f05abdffdd4/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.650158 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.653007 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.653691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64001d1-6972-4563-a764-05b359233d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.687984 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwk9l\" (UniqueName: 
\"kubernetes.io/projected/d64001d1-6972-4563-a764-05b359233d62-kube-api-access-gwk9l\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.691964 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6545f49b85-762lt"] Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.827420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f5442b2-466c-497d-97f0-c22697b04d0e\") pod \"glance-default-external-api-0\" (UID: \"d64001d1-6972-4563-a764-05b359233d62\") " pod="openstack/glance-default-external-api-0" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.879138 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:54 crc kubenswrapper[4762]: I0217 14:31:54.947017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.017213 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.054893 4762 generic.go:334] "Generic (PLEG): container finished" podID="88939b89-be48-48f3-85c6-542eea161552" containerID="9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7" exitCode=0 Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.056486 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerDied","Data":"9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7"} Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.056529 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerStarted","Data":"a0ad6bf3d1e4ed9ac753170c37b216bfbb5084e2fa9aab1377d334d216a8b9da"} Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.132267 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-579766b5b-pgs2q" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.223499 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"] Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.343780 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-68c7cc4b78-lr6mt" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.367618 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.460409 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.460679 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-f8f7cc6b-9bscz" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerName="heat-engine" containerID="cri-o://c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" gracePeriod=60 Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.498608 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lftlj\" (UniqueName: \"kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj\") pod \"abea76c2-c351-4c12-85c0-fb86db09cdd1\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.498707 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle\") pod \"abea76c2-c351-4c12-85c0-fb86db09cdd1\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.498981 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data\") pod \"abea76c2-c351-4c12-85c0-fb86db09cdd1\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.499056 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom\") pod \"abea76c2-c351-4c12-85c0-fb86db09cdd1\" (UID: \"abea76c2-c351-4c12-85c0-fb86db09cdd1\") " Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.511037 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "abea76c2-c351-4c12-85c0-fb86db09cdd1" (UID: "abea76c2-c351-4c12-85c0-fb86db09cdd1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.546219 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj" (OuterVolumeSpecName: "kube-api-access-lftlj") pod "abea76c2-c351-4c12-85c0-fb86db09cdd1" (UID: "abea76c2-c351-4c12-85c0-fb86db09cdd1"). InnerVolumeSpecName "kube-api-access-lftlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.579766 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abea76c2-c351-4c12-85c0-fb86db09cdd1" (UID: "abea76c2-c351-4c12-85c0-fb86db09cdd1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.604069 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.604108 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lftlj\" (UniqueName: \"kubernetes.io/projected/abea76c2-c351-4c12-85c0-fb86db09cdd1-kube-api-access-lftlj\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.604138 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.641883 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data" (OuterVolumeSpecName: "config-data") pod "abea76c2-c351-4c12-85c0-fb86db09cdd1" (UID: "abea76c2-c351-4c12-85c0-fb86db09cdd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.706256 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abea76c2-c351-4c12-85c0-fb86db09cdd1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:55 crc kubenswrapper[4762]: I0217 14:31:55.876339 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.116835 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a357fec-26ca-4478-8ec4-34b141dbe886" path="/var/lib/kubelet/pods/2a357fec-26ca-4478-8ec4-34b141dbe886/volumes" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.132869 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerStarted","Data":"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785"} Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.152213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d64001d1-6972-4563-a764-05b359233d62","Type":"ContainerStarted","Data":"502725fb4cd4caeda80b2f858377b232cf147440cdcc0ef7a07b73d0b7ecff25"} Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.152960 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.159437 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" event={"ID":"19953d0a-f2bb-4e7c-b5fc-44218a467dc9","Type":"ContainerDied","Data":"a992d57ddd1f55ad229d97f1aae1c95c31f7850e69056aebe3c1ea53d0645cd6"} Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.159512 4762 scope.go:117] "RemoveContainer" containerID="1f48453362b4625d2d24b4d8bb01866718fe46fa95778547f956a6da7fb33667" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.168173 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6545f49b85-762lt" event={"ID":"abea76c2-c351-4c12-85c0-fb86db09cdd1","Type":"ContainerDied","Data":"cd044d0be7f349e3f9b44c9a5f711eb99d541fed131316eb937a343639bfc54d"} Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.168272 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6545f49b85-762lt" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.207714 4762 scope.go:117] "RemoveContainer" containerID="dd43b889ee7e21f1e1a649f2868838306f495dfd5e53582ad34ca0747b4409cd" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.242638 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom\") pod \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.243196 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle\") pod \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.243355 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data\") pod \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.243478 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpmzc\" (UniqueName: \"kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc\") pod \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\" (UID: \"19953d0a-f2bb-4e7c-b5fc-44218a467dc9\") " Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.265672 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc" (OuterVolumeSpecName: "kube-api-access-qpmzc") pod "19953d0a-f2bb-4e7c-b5fc-44218a467dc9" (UID: "19953d0a-f2bb-4e7c-b5fc-44218a467dc9"). InnerVolumeSpecName "kube-api-access-qpmzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.268227 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19953d0a-f2bb-4e7c-b5fc-44218a467dc9" (UID: "19953d0a-f2bb-4e7c-b5fc-44218a467dc9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.299376 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19953d0a-f2bb-4e7c-b5fc-44218a467dc9" (UID: "19953d0a-f2bb-4e7c-b5fc-44218a467dc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.327284 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6545f49b85-762lt"] Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.346869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data" (OuterVolumeSpecName: "config-data") pod "19953d0a-f2bb-4e7c-b5fc-44218a467dc9" (UID: "19953d0a-f2bb-4e7c-b5fc-44218a467dc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.348490 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpmzc\" (UniqueName: \"kubernetes.io/projected/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-kube-api-access-qpmzc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.348528 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.348537 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.348545 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19953d0a-f2bb-4e7c-b5fc-44218a467dc9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.351950 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6545f49b85-762lt"] Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.533003 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.533241 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5x5bg" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" containerID="cri-o://2c899ca16dbffc9ffd16c176d1a5962956dfca67f29dc0f5ed988a1d66008235" gracePeriod=2 Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.954757 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7x82n"] Feb 17 14:31:56 crc kubenswrapper[4762]: E0217 14:31:56.955830 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.955854 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:56 crc kubenswrapper[4762]: E0217 14:31:56.955872 4762 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.955878 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: E0217 14:31:56.955909 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.955918 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:56 crc kubenswrapper[4762]: E0217 14:31:56.955948 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.955956 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.956227 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.956259 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.956271 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" containerName="heat-api" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.957325 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.962521 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.962807 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 14:31:56 crc kubenswrapper[4762]: I0217 14:31:56.971260 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rh8rn" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.037234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7x82n"] Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.100946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.101418 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.101462 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlpxg\" (UniqueName: 
\"kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.101494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.191272 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68d86764f7-2hn2f" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.200504 4762 generic.go:334] "Generic (PLEG): container finished" podID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerID="2c899ca16dbffc9ffd16c176d1a5962956dfca67f29dc0f5ed988a1d66008235" exitCode=0 Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.200633 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerDied","Data":"2c899ca16dbffc9ffd16c176d1a5962956dfca67f29dc0f5ed988a1d66008235"} Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.200723 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x5bg" event={"ID":"aa5772d9-8e9a-473a-a36b-f93c2b269ce5","Type":"ContainerDied","Data":"fb666b90112391b53b4eac87a2636d25dbb4ec3b615ea1a973331fc2b6dc2d49"} Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.200740 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb666b90112391b53b4eac87a2636d25dbb4ec3b615ea1a973331fc2b6dc2d49" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.203794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.203851 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlpxg\" (UniqueName: \"kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.203893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.203979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 
crc kubenswrapper[4762]: I0217 14:31:57.207585 4762 generic.go:334] "Generic (PLEG): container finished" podID="88939b89-be48-48f3-85c6-542eea161552" containerID="68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785" exitCode=0 Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.207768 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerDied","Data":"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785"} Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.209094 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.214879 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.218489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.219766 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d64001d1-6972-4563-a764-05b359233d62","Type":"ContainerStarted","Data":"e8605069a9e86af53ac6ae506dd68fc33796048b69411c022cd743041fe38c79"} Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.249616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlpxg\" (UniqueName: \"kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg\") pod \"nova-cell0-conductor-db-sync-7x82n\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.255792 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.288431 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"] Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.305384 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-68d86764f7-2hn2f"] Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.305688 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk76k\" (UniqueName: \"kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k\") pod \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.305843 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content\") pod \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.305934 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities\") pod \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\" (UID: \"aa5772d9-8e9a-473a-a36b-f93c2b269ce5\") " Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.306740 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities" (OuterVolumeSpecName: "utilities") pod "aa5772d9-8e9a-473a-a36b-f93c2b269ce5" (UID: "aa5772d9-8e9a-473a-a36b-f93c2b269ce5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.310260 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k" (OuterVolumeSpecName: "kube-api-access-wk76k") pod "aa5772d9-8e9a-473a-a36b-f93c2b269ce5" (UID: "aa5772d9-8e9a-473a-a36b-f93c2b269ce5"). InnerVolumeSpecName "kube-api-access-wk76k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.310345 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.343299 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa5772d9-8e9a-473a-a36b-f93c2b269ce5" (UID: "aa5772d9-8e9a-473a-a36b-f93c2b269ce5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.412360 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk76k\" (UniqueName: \"kubernetes.io/projected/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-kube-api-access-wk76k\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.412419 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5772d9-8e9a-473a-a36b-f93c2b269ce5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:57 crc kubenswrapper[4762]: I0217 14:31:57.520164 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.094622 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" path="/var/lib/kubelet/pods/19953d0a-f2bb-4e7c-b5fc-44218a467dc9/volumes" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.097932 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abea76c2-c351-4c12-85c0-fb86db09cdd1" path="/var/lib/kubelet/pods/abea76c2-c351-4c12-85c0-fb86db09cdd1/volumes" Feb 17 14:31:58 crc kubenswrapper[4762]: E0217 14:31:58.138815 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 17 14:31:58 crc kubenswrapper[4762]: E0217 14:31:58.162091 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 17 14:31:58 crc kubenswrapper[4762]: E0217 14:31:58.167502 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 17 14:31:58 crc kubenswrapper[4762]: E0217 14:31:58.167578 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-f8f7cc6b-9bscz" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerName="heat-engine" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.235266 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerStarted","Data":"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684"} Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.238895 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x5bg" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.240163 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d64001d1-6972-4563-a764-05b359233d62","Type":"ContainerStarted","Data":"366d4ee02274a48ef934aa8a9914403092e90cc652919cfe6f03f80b09e34196"} Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.276350 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7x82n"] Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.298255 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bzsgc" podStartSLOduration=2.705946684 podStartE2EDuration="5.298228856s" podCreationTimestamp="2026-02-17 14:31:53 +0000 UTC" firstStartedPulling="2026-02-17 14:31:55.06421373 +0000 UTC m=+1595.644214372" lastFinishedPulling="2026-02-17 14:31:57.656495892 +0000 UTC m=+1598.236496544" observedRunningTime="2026-02-17 14:31:58.256976688 +0000 UTC m=+1598.836977350" watchObservedRunningTime="2026-02-17 14:31:58.298228856 +0000 UTC m=+1598.878229508" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.328117 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.345625 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x5bg"] Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.348207 4762 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod47460499-0eb9-4fcb-bd2b-8e7084f6f26c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod47460499-0eb9-4fcb-bd2b-8e7084f6f26c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod47460499_0eb9_4fcb_bd2b_8e7084f6f26c.slice" Feb 17 14:31:58 crc kubenswrapper[4762]: E0217 14:31:58.348266 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod47460499-0eb9-4fcb-bd2b-8e7084f6f26c] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod47460499-0eb9-4fcb-bd2b-8e7084f6f26c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod47460499_0eb9_4fcb_bd2b_8e7084f6f26c.slice" pod="openstack/ceilometer-0" podUID="47460499-0eb9-4fcb-bd2b-8e7084f6f26c" Feb 17 14:31:58 crc kubenswrapper[4762]: I0217 14:31:58.356539 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.356515835 podStartE2EDuration="4.356515835s" podCreationTimestamp="2026-02-17 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:58.299826489 +0000 UTC m=+1598.879827151" watchObservedRunningTime="2026-02-17 14:31:58.356515835 +0000 UTC m=+1598.936516487" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.263436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7x82n" event={"ID":"92bb66fd-cea7-435b-8915-0641110c25af","Type":"ContainerStarted","Data":"41016d949e76162932db7778103baeb307f4ba5d546e5f90e4e976dbbf4cc162"} Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.264325 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.329081 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.353603 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.374999 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4762]: E0217 14:31:59.375589 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.375612 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" Feb 17 14:31:59 crc kubenswrapper[4762]: E0217 14:31:59.375632 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="extract-utilities" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.375639 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="extract-utilities" Feb 17 14:31:59 crc kubenswrapper[4762]: E0217 14:31:59.375674 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="extract-content" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.375682 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="extract-content" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.375963 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="19953d0a-f2bb-4e7c-b5fc-44218a467dc9" containerName="heat-cfnapi" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.375979 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" containerName="registry-server" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.378287 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.386099 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.386308 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.392222 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.463544 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwdj4\" (UniqueName: \"kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.463607 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.463628 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.471867 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.472086 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.472560 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.472625 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579391 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 
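
The pod_startup_latency_tracker entry for certified-operators-bzsgc a little earlier makes its own arithmetic checkable: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that end-to-end figure minus the image-pull window (lastFinishedPulling − firstStartedPulling). A quick check with the logged values, times truncated to microseconds:

    from datetime import datetime

    fmt = "%H:%M:%S.%f"
    created      = datetime.strptime("14:31:53.000000", fmt)  # podCreationTimestamp
    first_pull   = datetime.strptime("14:31:55.064213", fmt)  # firstStartedPulling
    last_pull    = datetime.strptime("14:31:57.656495", fmt)  # lastFinishedPulling
    observed_run = datetime.strptime("14:31:58.298228", fmt)  # watchObservedRunningTime

    e2e  = observed_run - created   # logged podStartE2EDuration = 5.298228856s
    pull = last_pull - first_pull   # image-pull window ~ 2.592282s
    slo  = e2e - pull               # logged podStartSLOduration = 2.705946684
    print(e2e, pull, slo)           # 0:00:05.298228 0:00:02.592282 0:00:02.705946

The same relation holds for ceilometer-0 further down (6.095882658s end-to-end minus a ~3.59s pull window, matching the logged 2.504056159 to within float rounding), so the SLO figure is effectively startup latency with image pulling excluded.
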
14:31:59.579453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwdj4\" (UniqueName: \"kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579510 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579527 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.579640 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.581816 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.582112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.587969 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.589092 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.592088 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.601301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.643542 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwdj4\" (UniqueName: \"kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4\") pod \"ceilometer-0\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.718606 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.943342 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 14:31:59 crc kubenswrapper[4762]: I0217 14:31:59.943750 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.004334 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.006147 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.092066 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47460499-0eb9-4fcb-bd2b-8e7084f6f26c" path="/var/lib/kubelet/pods/47460499-0eb9-4fcb-bd2b-8e7084f6f26c/volumes" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.092924 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5772d9-8e9a-473a-a36b-f93c2b269ce5" path="/var/lib/kubelet/pods/aa5772d9-8e9a-473a-a36b-f93c2b269ce5/volumes" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.276816 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.277155 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:00 crc kubenswrapper[4762]: I0217 14:32:00.347772 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:00 crc kubenswrapper[4762]: W0217 14:32:00.355971 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cec934d_bb52_4694_9146_8436ce1a9c1a.slice/crio-44833e809ef88b8ccdb2305df2d2867adba6882f504119ad3f7316f87929b462 WatchSource:0}: Error finding container 44833e809ef88b8ccdb2305df2d2867adba6882f504119ad3f7316f87929b462: Status 404 returned error can't find the container with id 44833e809ef88b8ccdb2305df2d2867adba6882f504119ad3f7316f87929b462 Feb 17 14:32:01 crc kubenswrapper[4762]: I0217 14:32:01.291374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerStarted","Data":"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a"} Feb 17 14:32:01 crc kubenswrapper[4762]: I0217 14:32:01.291726 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerStarted","Data":"44833e809ef88b8ccdb2305df2d2867adba6882f504119ad3f7316f87929b462"} Feb 17 14:32:01 crc kubenswrapper[4762]: E0217 14:32:01.622049 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37fd57d6_2520_488b_9ce4_c316d6d62bc5.slice/crio-conmon-c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.007802 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.019405 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.169875 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom\") pod \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.170286 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle\") pod \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.170542 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data\") pod \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.170567 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56hq\" (UniqueName: \"kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq\") pod \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\" (UID: \"37fd57d6-2520-488b-9ce4-c316d6d62bc5\") " Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.198907 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "37fd57d6-2520-488b-9ce4-c316d6d62bc5" (UID: "37fd57d6-2520-488b-9ce4-c316d6d62bc5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.205716 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq" (OuterVolumeSpecName: "kube-api-access-d56hq") pod "37fd57d6-2520-488b-9ce4-c316d6d62bc5" (UID: "37fd57d6-2520-488b-9ce4-c316d6d62bc5"). InnerVolumeSpecName "kube-api-access-d56hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.225973 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37fd57d6-2520-488b-9ce4-c316d6d62bc5" (UID: "37fd57d6-2520-488b-9ce4-c316d6d62bc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.257794 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data" (OuterVolumeSpecName: "config-data") pod "37fd57d6-2520-488b-9ce4-c316d6d62bc5" (UID: "37fd57d6-2520-488b-9ce4-c316d6d62bc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.275132 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.275437 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.275526 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37fd57d6-2520-488b-9ce4-c316d6d62bc5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.275603 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d56hq\" (UniqueName: \"kubernetes.io/projected/37fd57d6-2520-488b-9ce4-c316d6d62bc5-kube-api-access-d56hq\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.308631 4762 generic.go:334] "Generic (PLEG): container finished" podID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" exitCode=0 Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.308742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f8f7cc6b-9bscz" event={"ID":"37fd57d6-2520-488b-9ce4-c316d6d62bc5","Type":"ContainerDied","Data":"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421"} Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.308883 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f8f7cc6b-9bscz" event={"ID":"37fd57d6-2520-488b-9ce4-c316d6d62bc5","Type":"ContainerDied","Data":"cce4265dee757d4d3c19fd2007ddbb035894233315f7cfd4bc4fd2ea8cafa854"} Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.308911 4762 scope.go:117] "RemoveContainer" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.309862 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-f8f7cc6b-9bscz" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.317043 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerStarted","Data":"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027"} Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.415802 4762 scope.go:117] "RemoveContainer" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" Feb 17 14:32:02 crc kubenswrapper[4762]: E0217 14:32:02.416223 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421\": container with ID starting with c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421 not found: ID does not exist" containerID="c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.416253 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421"} err="failed to get container status \"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421\": rpc error: code = NotFound desc = could not find container \"c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421\": container with ID starting with c175cdb4390ae02c446a8daea4868584ccc1a68599568031f4c1ded03f0e6421 not found: ID does not exist" Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.439054 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:32:02 crc kubenswrapper[4762]: I0217 14:32:02.455602 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-f8f7cc6b-9bscz"] Feb 17 14:32:03 crc kubenswrapper[4762]: I0217 14:32:03.373877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerStarted","Data":"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4"} Feb 17 14:32:03 crc kubenswrapper[4762]: I0217 14:32:03.728116 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:03 crc kubenswrapper[4762]: I0217 14:32:03.728493 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.102314 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" path="/var/lib/kubelet/pods/37fd57d6-2520-488b-9ce4-c316d6d62bc5/volumes" Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.430276 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerStarted","Data":"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9"} Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.430931 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-central-agent" containerID="cri-o://ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a" gracePeriod=30 Feb 17 14:32:04 crc kubenswrapper[4762]: 
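
The Killing container with a grace period entries here give each ceilometer container 30s to exit on SIGTERM before a SIGKILL would follow; in the ContainerDied events just below, the central and notification agents exit 0 (clean shutdown) while sg-core exits 2, and no Died event for proxy-httpd appears in this window. The difference is usually just whether the process installs a terminate handler; a minimal Python shape of the clean variant:

    import signal, sys, time

    def on_term(signum, frame):
        # Flush state and exit 0 here, so the kubelet records exitCode=0
        # instead of the nonzero code sg-core shows below.
        sys.exit(0)

    signal.signal(signal.SIGTERM, on_term)
    while True:
        time.sleep(1)   # stand-in for the agent's real work loop
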
I0217 14:32:04.431277 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.431787 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="proxy-httpd" containerID="cri-o://9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9" gracePeriod=30 Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.431890 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="sg-core" containerID="cri-o://1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4" gracePeriod=30 Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.431943 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-notification-agent" containerID="cri-o://59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027" gracePeriod=30 Feb 17 14:32:04 crc kubenswrapper[4762]: I0217 14:32:04.795745 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bzsgc" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="registry-server" probeResult="failure" output=< Feb 17 14:32:04 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:32:04 crc kubenswrapper[4762]: > Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.017959 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.018266 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.058260 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.079473 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.095907 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.504056159 podStartE2EDuration="6.095882658s" podCreationTimestamp="2026-02-17 14:31:59 +0000 UTC" firstStartedPulling="2026-02-17 14:32:00.359582517 +0000 UTC m=+1600.939583169" lastFinishedPulling="2026-02-17 14:32:03.951409006 +0000 UTC m=+1604.531409668" observedRunningTime="2026-02-17 14:32:04.47686525 +0000 UTC m=+1605.056865912" watchObservedRunningTime="2026-02-17 14:32:05.095882658 +0000 UTC m=+1605.675883310" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.249047 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.249213 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.348256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451342 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerID="1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4" exitCode=2 Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451385 4762 generic.go:334] "Generic (PLEG): container finished" podID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerID="59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027" exitCode=0 Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451398 4762 generic.go:334] "Generic (PLEG): container finished" podID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerID="ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a" exitCode=0 Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerDied","Data":"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4"} Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451525 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerDied","Data":"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027"} Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.451540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerDied","Data":"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a"} Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.452275 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 14:32:05 crc kubenswrapper[4762]: I0217 14:32:05.452315 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 14:32:07 crc kubenswrapper[4762]: I0217 14:32:07.071375 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:32:07 crc kubenswrapper[4762]: E0217 14:32:07.072093 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:32:08 crc kubenswrapper[4762]: I0217 14:32:08.038436 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 14:32:08 crc kubenswrapper[4762]: I0217 14:32:08.038955 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:32:08 crc kubenswrapper[4762]: I0217 14:32:08.389570 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 14:32:13 crc kubenswrapper[4762]: I0217 14:32:13.786572 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:13 crc kubenswrapper[4762]: I0217 14:32:13.844657 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:14 crc kubenswrapper[4762]: I0217 14:32:14.143211 4762 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:32:15 crc kubenswrapper[4762]: I0217 14:32:15.748705 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bzsgc" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="registry-server" containerID="cri-o://7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684" gracePeriod=2 Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.280130 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.453272 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpctl\" (UniqueName: \"kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl\") pod \"88939b89-be48-48f3-85c6-542eea161552\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.453361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content\") pod \"88939b89-be48-48f3-85c6-542eea161552\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.453507 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities\") pod \"88939b89-be48-48f3-85c6-542eea161552\" (UID: \"88939b89-be48-48f3-85c6-542eea161552\") " Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.454165 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities" (OuterVolumeSpecName: "utilities") pod "88939b89-be48-48f3-85c6-542eea161552" (UID: "88939b89-be48-48f3-85c6-542eea161552"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.454381 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.459885 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl" (OuterVolumeSpecName: "kube-api-access-rpctl") pod "88939b89-be48-48f3-85c6-542eea161552" (UID: "88939b89-be48-48f3-85c6-542eea161552"). InnerVolumeSpecName "kube-api-access-rpctl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.509548 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88939b89-be48-48f3-85c6-542eea161552" (UID: "88939b89-be48-48f3-85c6-542eea161552"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.556741 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpctl\" (UniqueName: \"kubernetes.io/projected/88939b89-be48-48f3-85c6-542eea161552-kube-api-access-rpctl\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.556776 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88939b89-be48-48f3-85c6-542eea161552-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.763018 4762 generic.go:334] "Generic (PLEG): container finished" podID="88939b89-be48-48f3-85c6-542eea161552" containerID="7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684" exitCode=0 Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.763081 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerDied","Data":"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684"} Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.763132 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzsgc" event={"ID":"88939b89-be48-48f3-85c6-542eea161552","Type":"ContainerDied","Data":"a0ad6bf3d1e4ed9ac753170c37b216bfbb5084e2fa9aab1377d334d216a8b9da"} Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.763154 4762 scope.go:117] "RemoveContainer" containerID="7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.764228 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzsgc" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.765506 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7x82n" event={"ID":"92bb66fd-cea7-435b-8915-0641110c25af","Type":"ContainerStarted","Data":"d8df3855e0f6149ffd61f131162f7a26a55a32bd0885c8d0d06c0ea10669f091"} Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.794057 4762 scope.go:117] "RemoveContainer" containerID="68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.810000 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7x82n" podStartSLOduration=3.609285271 podStartE2EDuration="20.809972343s" podCreationTimestamp="2026-02-17 14:31:56 +0000 UTC" firstStartedPulling="2026-02-17 14:31:58.28213823 +0000 UTC m=+1598.862138882" lastFinishedPulling="2026-02-17 14:32:15.482825302 +0000 UTC m=+1616.062825954" observedRunningTime="2026-02-17 14:32:16.795530272 +0000 UTC m=+1617.375530924" watchObservedRunningTime="2026-02-17 14:32:16.809972343 +0000 UTC m=+1617.389973005" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.830379 4762 scope.go:117] "RemoveContainer" containerID="9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.853612 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.873165 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bzsgc"] Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.892911 4762 scope.go:117] "RemoveContainer" containerID="7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684" Feb 17 14:32:16 crc kubenswrapper[4762]: E0217 14:32:16.893523 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684\": container with ID starting with 7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684 not found: ID does not exist" containerID="7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.893572 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684"} err="failed to get container status \"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684\": rpc error: code = NotFound desc = could not find container \"7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684\": container with ID starting with 7eaf1070d67692ed4120d239a4047a61725689b43c6055744e1499338201f684 not found: ID does not exist" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.893625 4762 scope.go:117] "RemoveContainer" containerID="68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785" Feb 17 14:32:16 crc kubenswrapper[4762]: E0217 14:32:16.894146 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785\": container with ID starting with 68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785 not found: ID does not exist" 
containerID="68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.894187 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785"} err="failed to get container status \"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785\": rpc error: code = NotFound desc = could not find container \"68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785\": container with ID starting with 68c86864f8ec2a5c532b41a67f642eb93c4a3dad6de7261e3a671c9c463ca785 not found: ID does not exist" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.894215 4762 scope.go:117] "RemoveContainer" containerID="9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7" Feb 17 14:32:16 crc kubenswrapper[4762]: E0217 14:32:16.894676 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7\": container with ID starting with 9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7 not found: ID does not exist" containerID="9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7" Feb 17 14:32:16 crc kubenswrapper[4762]: I0217 14:32:16.894714 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7"} err="failed to get container status \"9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7\": rpc error: code = NotFound desc = could not find container \"9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7\": container with ID starting with 9d64b81e3921bc3206ab9676e7403a172a046e02f8249882bb208c0c923246d7 not found: ID does not exist" Feb 17 14:32:18 crc kubenswrapper[4762]: I0217 14:32:18.086559 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88939b89-be48-48f3-85c6-542eea161552" path="/var/lib/kubelet/pods/88939b89-be48-48f3-85c6-542eea161552/volumes" Feb 17 14:32:19 crc kubenswrapper[4762]: I0217 14:32:19.071720 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:32:19 crc kubenswrapper[4762]: E0217 14:32:19.072104 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:32:27 crc kubenswrapper[4762]: I0217 14:32:27.667593 4762 generic.go:334] "Generic (PLEG): container finished" podID="92bb66fd-cea7-435b-8915-0641110c25af" containerID="d8df3855e0f6149ffd61f131162f7a26a55a32bd0885c8d0d06c0ea10669f091" exitCode=0 Feb 17 14:32:27 crc kubenswrapper[4762]: I0217 14:32:27.667699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7x82n" event={"ID":"92bb66fd-cea7-435b-8915-0641110c25af","Type":"ContainerDied","Data":"d8df3855e0f6149ffd61f131162f7a26a55a32bd0885c8d0d06c0ea10669f091"} Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.546250 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.725629 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlpxg\" (UniqueName: \"kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg\") pod \"92bb66fd-cea7-435b-8915-0641110c25af\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.726054 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data\") pod \"92bb66fd-cea7-435b-8915-0641110c25af\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.726200 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts\") pod \"92bb66fd-cea7-435b-8915-0641110c25af\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.726300 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle\") pod \"92bb66fd-cea7-435b-8915-0641110c25af\" (UID: \"92bb66fd-cea7-435b-8915-0641110c25af\") " Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.734005 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg" (OuterVolumeSpecName: "kube-api-access-mlpxg") pod "92bb66fd-cea7-435b-8915-0641110c25af" (UID: "92bb66fd-cea7-435b-8915-0641110c25af"). InnerVolumeSpecName "kube-api-access-mlpxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.734082 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.734120 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts" (OuterVolumeSpecName: "scripts") pod "92bb66fd-cea7-435b-8915-0641110c25af" (UID: "92bb66fd-cea7-435b-8915-0641110c25af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.767875 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92bb66fd-cea7-435b-8915-0641110c25af" (UID: "92bb66fd-cea7-435b-8915-0641110c25af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.773534 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data" (OuterVolumeSpecName: "config-data") pod "92bb66fd-cea7-435b-8915-0641110c25af" (UID: "92bb66fd-cea7-435b-8915-0641110c25af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.801068 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7x82n" event={"ID":"92bb66fd-cea7-435b-8915-0641110c25af","Type":"ContainerDied","Data":"41016d949e76162932db7778103baeb307f4ba5d546e5f90e4e976dbbf4cc162"} Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.801117 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41016d949e76162932db7778103baeb307f4ba5d546e5f90e4e976dbbf4cc162" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.801194 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7x82n" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.829202 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.829245 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.829262 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bb66fd-cea7-435b-8915-0641110c25af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:29 crc kubenswrapper[4762]: I0217 14:32:29.829278 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlpxg\" (UniqueName: \"kubernetes.io/projected/92bb66fd-cea7-435b-8915-0641110c25af-kube-api-access-mlpxg\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.292513 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:32:30 crc kubenswrapper[4762]: E0217 14:32:30.293157 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="extract-utilities" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293176 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="extract-utilities" Feb 17 14:32:30 crc kubenswrapper[4762]: E0217 14:32:30.293220 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerName="heat-engine" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293228 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerName="heat-engine" Feb 17 14:32:30 crc kubenswrapper[4762]: E0217 14:32:30.293258 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="registry-server" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293267 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="registry-server" Feb 17 14:32:30 crc kubenswrapper[4762]: E0217 14:32:30.293283 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb66fd-cea7-435b-8915-0641110c25af" containerName="nova-cell0-conductor-db-sync" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293290 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="92bb66fd-cea7-435b-8915-0641110c25af" containerName="nova-cell0-conductor-db-sync" Feb 17 14:32:30 crc kubenswrapper[4762]: E0217 14:32:30.293318 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="extract-content" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293325 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="extract-content" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293562 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bb66fd-cea7-435b-8915-0641110c25af" containerName="nova-cell0-conductor-db-sync" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293584 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="88939b89-be48-48f3-85c6-542eea161552" containerName="registry-server" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.293591 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fd57d6-2520-488b-9ce4-c316d6d62bc5" containerName="heat-engine" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.294576 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.299577 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.299704 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rh8rn" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.310389 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.450676 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.451034 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.451210 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr2xt\" (UniqueName: \"kubernetes.io/projected/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-kube-api-access-dr2xt\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.552936 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.553123 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.553192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr2xt\" (UniqueName: \"kubernetes.io/projected/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-kube-api-access-dr2xt\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.557475 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.557662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.874915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr2xt\" (UniqueName: \"kubernetes.io/projected/889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0-kube-api-access-dr2xt\") pod \"nova-cell0-conductor-0\" (UID: \"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:30 crc kubenswrapper[4762]: I0217 14:32:30.924306 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:31 crc kubenswrapper[4762]: I0217 14:32:31.072245 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:32:31 crc kubenswrapper[4762]: E0217 14:32:31.072880 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:32:31 crc kubenswrapper[4762]: I0217 14:32:31.860975 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:32:31 crc kubenswrapper[4762]: I0217 14:32:31.910580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0","Type":"ContainerStarted","Data":"d70c31dd5f420bf543a78fa19ebd8577fde8c99a0af4896fa9f76da840f861e2"} Feb 17 14:32:32 crc kubenswrapper[4762]: I0217 14:32:32.946333 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0","Type":"ContainerStarted","Data":"f02c3aacf719c206308aa2fa26ac97bc2ec500dccf1d3ffd042b65fb7455e716"} Feb 17 14:32:32 crc kubenswrapper[4762]: I0217 14:32:32.946720 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:32 crc kubenswrapper[4762]: I0217 14:32:32.975622 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.975597025 podStartE2EDuration="2.975597025s" podCreationTimestamp="2026-02-17 14:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:32.964539876 +0000 UTC m=+1633.544540528" watchObservedRunningTime="2026-02-17 14:32:32.975597025 +0000 UTC m=+1633.555597677" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.022878 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.144347 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.144512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.144623 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.145478 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.146207 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.146361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwdj4\" (UniqueName: \"kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.146395 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.146461 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd\") pod \"3cec934d-bb52-4694-9146-8436ce1a9c1a\" (UID: \"3cec934d-bb52-4694-9146-8436ce1a9c1a\") " Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.147407 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.147947 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.150711 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4" (OuterVolumeSpecName: "kube-api-access-gwdj4") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "kube-api-access-gwdj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.154761 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts" (OuterVolumeSpecName: "scripts") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.185496 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.241351 4762 generic.go:334] "Generic (PLEG): container finished" podID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerID="9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9" exitCode=137 Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.241395 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerDied","Data":"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9"} Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.241422 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cec934d-bb52-4694-9146-8436ce1a9c1a","Type":"ContainerDied","Data":"44833e809ef88b8ccdb2305df2d2867adba6882f504119ad3f7316f87929b462"} Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.241441 4762 scope.go:117] "RemoveContainer" containerID="9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.241554 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.250915 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.251891 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwdj4\" (UniqueName: \"kubernetes.io/projected/3cec934d-bb52-4694-9146-8436ce1a9c1a-kube-api-access-gwdj4\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.251908 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cec934d-bb52-4694-9146-8436ce1a9c1a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.251921 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.254319 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.286982 4762 scope.go:117] "RemoveContainer" containerID="1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.292785 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data" (OuterVolumeSpecName: "config-data") pod "3cec934d-bb52-4694-9146-8436ce1a9c1a" (UID: "3cec934d-bb52-4694-9146-8436ce1a9c1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.310086 4762 scope.go:117] "RemoveContainer" containerID="59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.330682 4762 scope.go:117] "RemoveContainer" containerID="ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.351803 4762 scope.go:117] "RemoveContainer" containerID="9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.352237 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9\": container with ID starting with 9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9 not found: ID does not exist" containerID="9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.352297 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9"} err="failed to get container status \"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9\": rpc error: code = NotFound desc = could not find container \"9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9\": container with ID starting with 9c5102aa286500894f1d0f21823753fe8a84b7d4cf8ea83dc97ba4daec64bfc9 not found: ID does not exist" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.352322 4762 scope.go:117] "RemoveContainer" containerID="1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.352551 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4\": container with ID starting with 1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4 not found: ID does not exist" containerID="1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.352584 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4"} err="failed to get container status \"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4\": rpc error: code = NotFound desc = could not find container \"1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4\": container with ID starting with 1b526ad0d3c47e9ba3b13f4edece2d06caec8a8b70915d36cd694553e3e80ae4 not found: ID does not exist" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.352605 4762 scope.go:117] "RemoveContainer" containerID="59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.352890 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027\": container with ID starting with 59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027 not found: ID does not exist" containerID="59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027" Feb 17 14:32:35 crc 
kubenswrapper[4762]: I0217 14:32:35.352907 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027"} err="failed to get container status \"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027\": rpc error: code = NotFound desc = could not find container \"59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027\": container with ID starting with 59fa38b1750545ec00f5bd4a750dea98f02e6f6079dbc74ba34c06f8889e0027 not found: ID does not exist" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.352920 4762 scope.go:117] "RemoveContainer" containerID="ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.353078 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a\": container with ID starting with ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a not found: ID does not exist" containerID="ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.353096 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a"} err="failed to get container status \"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a\": rpc error: code = NotFound desc = could not find container \"ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a\": container with ID starting with ef2801382df47aea70243d1329ce0d48c6ea1c3b2bc2a708a43f2c129d31fb2a not found: ID does not exist" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.353886 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.353922 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cec934d-bb52-4694-9146-8436ce1a9c1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.632454 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.960980 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.993729 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.994241 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="proxy-httpd" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994262 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="proxy-httpd" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.994301 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-central-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994308 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" 
containerName="ceilometer-central-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.994319 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="sg-core" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994325 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="sg-core" Feb 17 14:32:35 crc kubenswrapper[4762]: E0217 14:32:35.994347 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-notification-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994354 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-notification-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994573 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-notification-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994596 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="proxy-httpd" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994608 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="ceilometer-central-agent" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.994624 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" containerName="sg-core" Feb 17 14:32:35 crc kubenswrapper[4762]: I0217 14:32:35.997275 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.001765 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.001982 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.014458 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087027 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cec934d-bb52-4694-9146-8436ce1a9c1a" path="/var/lib/kubelet/pods/3cec934d-bb52-4694-9146-8436ce1a9c1a/volumes" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087331 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087568 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljdhx\" (UniqueName: \"kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087890 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.087953 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190786 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190839 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190936 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljdhx\" (UniqueName: \"kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.190983 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.191051 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.191315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.191797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.196603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.196725 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.197801 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.198757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.212832 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljdhx\" (UniqueName: \"kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx\") pod \"ceilometer-0\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " pod="openstack/ceilometer-0" Feb 17 14:32:36 crc kubenswrapper[4762]: I0217 14:32:36.600943 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:32:37 crc kubenswrapper[4762]: I0217 14:32:37.452440 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:37 crc kubenswrapper[4762]: I0217 14:32:37.490893 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:38 crc kubenswrapper[4762]: I0217 14:32:38.393272 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerStarted","Data":"f6ed86882b8a6fc97ef15682de3e38aa93b3d6ba89042608649ec488ff9de44b"} Feb 17 14:32:38 crc kubenswrapper[4762]: I0217 14:32:38.393705 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerStarted","Data":"385f7ee76b29aeefcf94df508b106460b68ac231c2258f670aa35452bc572a81"} Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.789687 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.794879 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerStarted","Data":"1959b3497a489f7f2471031234df2e8f3d9f1f74c04b832f1f4889c159828db8"} Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.795066 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.805057 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.897759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.897993 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxsl\" (UniqueName: \"kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.903267 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.915234 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0adb-account-create-update-v2qxg"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.917163 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.935181 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.948743 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-phqhg"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.950829 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.963806 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0adb-account-create-update-v2qxg"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.979031 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-phqhg"] Feb 17 14:32:40 crc kubenswrapper[4762]: I0217 14:32:40.984108 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.021818 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.021980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.022178 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.022304 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxsl\" (UniqueName: \"kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.022329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blq2c\" (UniqueName: \"kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.023760 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.029081 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.045695 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxsl\" (UniqueName: 
\"kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl\") pod \"community-operators-pxsl2\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.127583 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blq2c\" (UniqueName: \"kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.127736 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.127828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.127957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzl6d\" (UniqueName: \"kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.133395 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.135283 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.179405 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blq2c\" (UniqueName: \"kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c\") pod \"aodh-db-create-phqhg\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.231972 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.232178 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzl6d\" (UniqueName: \"kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.233592 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.256920 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzl6d\" (UniqueName: \"kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d\") pod \"aodh-0adb-account-create-update-v2qxg\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.276854 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:41 crc kubenswrapper[4762]: I0217 14:32:41.291535 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:41.847972 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerStarted","Data":"e64f3111613e1c77e6b75a922b272b144d351f2b7b739fac7dde6366b2ec1344"} Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.832537 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wdbb8"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.834581 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.841151 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.841438 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.847532 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wdbb8"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.950562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.950771 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj4p9\" (UniqueName: \"kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.950897 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:43.950953 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.035183 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.048606 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.053281 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.053366 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.053505 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.055164 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.058815 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj4p9\" (UniqueName: \"kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.071517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.071582 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.074470 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.410679 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548t8\" (UniqueName: \"kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.410807 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.410834 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.410959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.424176 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.438264 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.440955 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.460039 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.462923 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.465043 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj4p9\" (UniqueName: \"kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9\") pod \"nova-cell0-cell-mapping-wdbb8\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") " pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539080 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539178 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbwd\" (UniqueName: \"kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539298 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " 
pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539429 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548t8\" (UniqueName: \"kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.539717 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.540122 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.556062 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.563057 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548t8\" (UniqueName: \"kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.564909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data\") pod \"nova-api-0\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") " pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.576579 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.595185 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.595270 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.603510 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.641368 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.641434 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbwd\" (UniqueName: \"kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.641497 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.653385 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.655805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.705803 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbwd\" (UniqueName: \"kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd\") pod \"nova-scheduler-0\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.749457 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.749537 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjrw\" (UniqueName: \"kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.749575 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " 
pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.749619 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.777719 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wdbb8" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.822863 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.835985 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.837435 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.849361 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.863892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.864049 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrw\" (UniqueName: \"kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.864111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.864161 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.865527 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.896753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.901682 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hkjrw\" (UniqueName: \"kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.940299 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data\") pod \"nova-metadata-0\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " pod="openstack/nova-metadata-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.990431 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.990845 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs8wz\" (UniqueName: \"kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:44 crc kubenswrapper[4762]: I0217 14:32:44.991380 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.009023 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.315201 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.315302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs8wz\" (UniqueName: \"kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.333531 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.336917 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.342255 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.368593 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.371103 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.376422 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs8wz\" (UniqueName: \"kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.391234 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.423894 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:32:45 crc kubenswrapper[4762]: E0217 14:32:45.424304 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.444668 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.444940 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.450798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.451037 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.451184 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjbl\" (UniqueName: \"kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.451418 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.473675 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.565032 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.565318 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.565444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.565627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.565850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjbl\" (UniqueName: \"kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.566127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.568754 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: 
\"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.574340 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.575121 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.576254 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.582549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.636373 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0adb-account-create-update-v2qxg"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.639143 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjbl\" (UniqueName: \"kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl\") pod \"dnsmasq-dns-568d7fd7cf-ktxq9\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.655957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-phqhg"] Feb 17 14:32:45 crc kubenswrapper[4762]: I0217 14:32:45.738551 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.160809 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.160978 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.373869 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.437259 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerStarted","Data":"29a82c160b6f08ce019366202cc92092b79121a7e11c71afa6a4eecda5aa4133"} Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.474053 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-central-agent" containerID="cri-o://f6ed86882b8a6fc97ef15682de3e38aa93b3d6ba89042608649ec488ff9de44b" gracePeriod=30 Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.474381 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="proxy-httpd" containerID="cri-o://67387f4a707dde3c0a45f58e23b87997dccd841113d7e155a74b27c87b083720" gracePeriod=30 Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.474437 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="sg-core" containerID="cri-o://e64f3111613e1c77e6b75a922b272b144d351f2b7b739fac7dde6366b2ec1344" gracePeriod=30 Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.474447 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.474502 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-notification-agent" containerID="cri-o://1959b3497a489f7f2471031234df2e8f3d9f1f74c04b832f1f4889c159828db8" gracePeriod=30 Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.483125 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wdbb8"] Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.498231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0adb-account-create-update-v2qxg" event={"ID":"5b5722df-f962-403c-abfa-793bc821be57","Type":"ContainerStarted","Data":"0b0e5711845cc252e5e3370c75db21e8d396a872b15d20c85e19ebe28ae9b03d"} Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.689348 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-phqhg" event={"ID":"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55","Type":"ContainerStarted","Data":"0a620c73ad5778e44a78bb4354de481672512d4397f6596b88112a5191fb74ad"} Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.748624 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:32:46 crc kubenswrapper[4762]: I0217 14:32:46.775506 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.624157848 podStartE2EDuration="11.77546951s" podCreationTimestamp="2026-02-17 14:32:35 +0000 UTC" firstStartedPulling="2026-02-17 14:32:37.444619446 +0000 UTC m=+1638.024620098" lastFinishedPulling="2026-02-17 14:32:44.595931108 +0000 UTC m=+1645.175931760" observedRunningTime="2026-02-17 14:32:46.679975513 +0000 UTC m=+1647.259976165" watchObservedRunningTime="2026-02-17 14:32:46.77546951 +0000 UTC m=+1647.355470162" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 
14:32:47.009742 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.101489 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9zsnn"] Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.105073 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.117833 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.123203 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.157743 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9zsnn"] Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.246265 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.246406 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjqt\" (UniqueName: \"kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.246535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.246612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.352567 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.352920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjqt\" (UniqueName: \"kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.352975 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.353022 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.365694 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.375282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.383631 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.402724 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.422038 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjqt\" (UniqueName: \"kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt\") pod \"nova-cell1-conductor-db-sync-9zsnn\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.479834 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.750603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerStarted","Data":"bdfaf0d66f4f4e9b5bb546474a6be765a09ae2bc62c10879ecdda7ba7e7e6620"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.758608 4762 generic.go:334] "Generic (PLEG): container finished" podID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerID="6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb" exitCode=0 Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.758718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerDied","Data":"6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.766859 4762 generic.go:334] "Generic (PLEG): container finished" podID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerID="e64f3111613e1c77e6b75a922b272b144d351f2b7b739fac7dde6366b2ec1344" exitCode=2 Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.766885 4762 generic.go:334] "Generic (PLEG): container finished" podID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerID="1959b3497a489f7f2471031234df2e8f3d9f1f74c04b832f1f4889c159828db8" exitCode=0 Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.766921 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerStarted","Data":"67387f4a707dde3c0a45f58e23b87997dccd841113d7e155a74b27c87b083720"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.766941 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerDied","Data":"e64f3111613e1c77e6b75a922b272b144d351f2b7b739fac7dde6366b2ec1344"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.766953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerDied","Data":"1959b3497a489f7f2471031234df2e8f3d9f1f74c04b832f1f4889c159828db8"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.768138 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0adb-account-create-update-v2qxg" event={"ID":"5b5722df-f962-403c-abfa-793bc821be57","Type":"ContainerStarted","Data":"6b56d7029a2965e5de4afa01619427cb94928a5a2b9f8f1aa928695001e8cc1d"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.769895 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wdbb8" event={"ID":"a4589d86-754e-46ec-bd8f-412abdf21890","Type":"ContainerStarted","Data":"5e6a6cc960de7e807aea06e474caf0188d0711267cba6796ceb1ff821e24407c"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.771447 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-phqhg" event={"ID":"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55","Type":"ContainerStarted","Data":"73161d86078c8db13cbff44883dd9f44405ed482a55af875f557eee2037e6468"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.773079 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" 
event={"ID":"017f582c-a428-4df1-85e2-955bd88c9b26","Type":"ContainerStarted","Data":"d6b57840b8086c9e15ec5808e20de28c7ad8a04eff43787bc78252ea4af3a28d"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.773979 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263","Type":"ContainerStarted","Data":"17d1956323d8484a803dc651c21c1bfca2808c75b401feb741d42f57fb0426dd"} Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.829070 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0adb-account-create-update-v2qxg" podStartSLOduration=7.82904468 podStartE2EDuration="7.82904468s" podCreationTimestamp="2026-02-17 14:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:47.822596525 +0000 UTC m=+1648.402597187" watchObservedRunningTime="2026-02-17 14:32:47.82904468 +0000 UTC m=+1648.409045322" Feb 17 14:32:47 crc kubenswrapper[4762]: I0217 14:32:47.847448 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-phqhg" podStartSLOduration=7.847427168 podStartE2EDuration="7.847427168s" podCreationTimestamp="2026-02-17 14:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:47.847110499 +0000 UTC m=+1648.427111151" watchObservedRunningTime="2026-02-17 14:32:47.847427168 +0000 UTC m=+1648.427427820" Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.006411 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.037756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:48 crc kubenswrapper[4762]: W0217 14:32:48.053450 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e23dcf_8e71_4876_b67a_7649e342a8f2.slice/crio-eab731ebf7af34438a6ea5ae402df0804a8a82145e242ad1e7df81525ae849b4 WatchSource:0}: Error finding container eab731ebf7af34438a6ea5ae402df0804a8a82145e242ad1e7df81525ae849b4: Status 404 returned error can't find the container with id eab731ebf7af34438a6ea5ae402df0804a8a82145e242ad1e7df81525ae849b4 Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.394429 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9zsnn"] Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.795160 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerStarted","Data":"eab731ebf7af34438a6ea5ae402df0804a8a82145e242ad1e7df81525ae849b4"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.799769 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d6333e0c-df36-41f4-9efa-f3b1c161fa9a","Type":"ContainerStarted","Data":"2c1230549bb0a9c609872f87d553791aef4556bd623645cbd474401369ea51f5"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.813087 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" event={"ID":"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d","Type":"ContainerStarted","Data":"745b57e6bf2efa1b71aa23513113a2fb00baba1fc7cb99b978eda5e9db9a2354"} Feb 17 
14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.813147 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" event={"ID":"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d","Type":"ContainerStarted","Data":"48bb7bc152b68943f9f2875120b6c7c8a7a3a8183af10a16fa848d6a559b8f32"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.823451 4762 generic.go:334] "Generic (PLEG): container finished" podID="5b5722df-f962-403c-abfa-793bc821be57" containerID="6b56d7029a2965e5de4afa01619427cb94928a5a2b9f8f1aa928695001e8cc1d" exitCode=0 Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.823567 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0adb-account-create-update-v2qxg" event={"ID":"5b5722df-f962-403c-abfa-793bc821be57","Type":"ContainerDied","Data":"6b56d7029a2965e5de4afa01619427cb94928a5a2b9f8f1aa928695001e8cc1d"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.846637 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wdbb8" event={"ID":"a4589d86-754e-46ec-bd8f-412abdf21890","Type":"ContainerStarted","Data":"0a7db91915ffc089979e848f81e2557ee1f9543eceec4a23d5f5ea6017f3e657"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.847169 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" podStartSLOduration=1.847142679 podStartE2EDuration="1.847142679s" podCreationTimestamp="2026-02-17 14:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:48.843350297 +0000 UTC m=+1649.423350949" watchObservedRunningTime="2026-02-17 14:32:48.847142679 +0000 UTC m=+1649.427143331" Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.857431 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" containerID="73161d86078c8db13cbff44883dd9f44405ed482a55af875f557eee2037e6468" exitCode=0 Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.857515 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-phqhg" event={"ID":"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55","Type":"ContainerDied","Data":"73161d86078c8db13cbff44883dd9f44405ed482a55af875f557eee2037e6468"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.864280 4762 generic.go:334] "Generic (PLEG): container finished" podID="017f582c-a428-4df1-85e2-955bd88c9b26" containerID="ba1d0114d094f9fc0b08a3e520d6413062ad123cbd491490c0d46ab67c5e0859" exitCode=0 Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.864330 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" event={"ID":"017f582c-a428-4df1-85e2-955bd88c9b26","Type":"ContainerDied","Data":"ba1d0114d094f9fc0b08a3e520d6413062ad123cbd491490c0d46ab67c5e0859"} Feb 17 14:32:48 crc kubenswrapper[4762]: I0217 14:32:48.953535 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wdbb8" podStartSLOduration=5.953507901 podStartE2EDuration="5.953507901s" podCreationTimestamp="2026-02-17 14:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:48.915687566 +0000 UTC m=+1649.495688218" watchObservedRunningTime="2026-02-17 14:32:48.953507901 +0000 UTC m=+1649.533508553" Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.160114 4762 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.192681 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.889864 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerStarted","Data":"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37"} Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.900229 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" event={"ID":"017f582c-a428-4df1-85e2-955bd88c9b26","Type":"ContainerStarted","Data":"3d963d3c523250d1170368819f3f00deb0ad2568068ffec474e10de1da127b5b"} Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.900389 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:49 crc kubenswrapper[4762]: I0217 14:32:49.951854 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" podStartSLOduration=5.951830525 podStartE2EDuration="5.951830525s" podCreationTimestamp="2026-02-17 14:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:49.941784253 +0000 UTC m=+1650.521784915" watchObservedRunningTime="2026-02-17 14:32:49.951830525 +0000 UTC m=+1650.531831177" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.320833 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.329893 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.435100 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzl6d\" (UniqueName: \"kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d\") pod \"5b5722df-f962-403c-abfa-793bc821be57\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.435346 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blq2c\" (UniqueName: \"kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c\") pod \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.436273 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts\") pod \"5b5722df-f962-403c-abfa-793bc821be57\" (UID: \"5b5722df-f962-403c-abfa-793bc821be57\") " Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.436315 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts\") pod \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\" (UID: \"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55\") " Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.437070 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b5722df-f962-403c-abfa-793bc821be57" (UID: "5b5722df-f962-403c-abfa-793bc821be57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.437668 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" (UID: "3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.460384 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c" (OuterVolumeSpecName: "kube-api-access-blq2c") pod "3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" (UID: "3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55"). InnerVolumeSpecName "kube-api-access-blq2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.465234 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d" (OuterVolumeSpecName: "kube-api-access-mzl6d") pod "5b5722df-f962-403c-abfa-793bc821be57" (UID: "5b5722df-f962-403c-abfa-793bc821be57"). InnerVolumeSpecName "kube-api-access-mzl6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.539221 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzl6d\" (UniqueName: \"kubernetes.io/projected/5b5722df-f962-403c-abfa-793bc821be57-kube-api-access-mzl6d\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.539256 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blq2c\" (UniqueName: \"kubernetes.io/projected/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-kube-api-access-blq2c\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.539267 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5722df-f962-403c-abfa-793bc821be57-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.539276 4762 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.924439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-phqhg" event={"ID":"3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55","Type":"ContainerDied","Data":"0a620c73ad5778e44a78bb4354de481672512d4397f6596b88112a5191fb74ad"} Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.924934 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a620c73ad5778e44a78bb4354de481672512d4397f6596b88112a5191fb74ad" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.924476 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-phqhg" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.926951 4762 generic.go:334] "Generic (PLEG): container finished" podID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerID="b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37" exitCode=0 Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.927046 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerDied","Data":"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37"} Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.935233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0adb-account-create-update-v2qxg" event={"ID":"5b5722df-f962-403c-abfa-793bc821be57","Type":"ContainerDied","Data":"0b0e5711845cc252e5e3370c75db21e8d396a872b15d20c85e19ebe28ae9b03d"} Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.935268 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0e5711845cc252e5e3370c75db21e8d396a872b15d20c85e19ebe28ae9b03d" Feb 17 14:32:51 crc kubenswrapper[4762]: I0217 14:32:51.935278 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0adb-account-create-update-v2qxg" Feb 17 14:32:52 crc kubenswrapper[4762]: I0217 14:32:52.959386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerStarted","Data":"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348"} Feb 17 14:32:52 crc kubenswrapper[4762]: I0217 14:32:52.962600 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263","Type":"ContainerStarted","Data":"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"} Feb 17 14:32:52 crc kubenswrapper[4762]: I0217 14:32:52.967477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerStarted","Data":"cd83dad5e360685ebc38eca2aca36eb53edbcf6f534129f8b4cc39e91add98cf"} Feb 17 14:32:52 crc kubenswrapper[4762]: I0217 14:32:52.969810 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d6333e0c-df36-41f4-9efa-f3b1c161fa9a","Type":"ContainerStarted","Data":"265976f262e9c2b001b72753aa8e69799c1f6e7118b1c455d40777e503ecc600"} Feb 17 14:32:52 crc kubenswrapper[4762]: I0217 14:32:52.969949 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://265976f262e9c2b001b72753aa8e69799c1f6e7118b1c455d40777e503ecc600" gracePeriod=30 Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.017623 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.610305595 podStartE2EDuration="9.017597193s" podCreationTimestamp="2026-02-17 14:32:44 +0000 UTC" firstStartedPulling="2026-02-17 14:32:47.019562893 +0000 UTC m=+1647.599563545" lastFinishedPulling="2026-02-17 14:32:52.426854491 +0000 UTC m=+1653.006855143" observedRunningTime="2026-02-17 14:32:53.00455407 +0000 UTC m=+1653.584554722" watchObservedRunningTime="2026-02-17 14:32:53.017597193 +0000 UTC m=+1653.597597845" Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.040125 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.64191845 podStartE2EDuration="9.040098803s" podCreationTimestamp="2026-02-17 14:32:44 +0000 UTC" firstStartedPulling="2026-02-17 14:32:48.033042996 +0000 UTC m=+1648.613043648" lastFinishedPulling="2026-02-17 14:32:52.431223349 +0000 UTC m=+1653.011224001" observedRunningTime="2026-02-17 14:32:53.029115215 +0000 UTC m=+1653.609115887" watchObservedRunningTime="2026-02-17 14:32:53.040098803 +0000 UTC m=+1653.620099465" Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.982666 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerStarted","Data":"dd1d61e4395f1ee047a795522118292aa07dc39bf280cbf996e58279b1113a81"} Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.985836 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerStarted","Data":"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615"} Feb 17 14:32:53 crc 
kubenswrapper[4762]: I0217 14:32:53.988580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerStarted","Data":"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511"} Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.988688 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-log" containerID="cri-o://2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" gracePeriod=30 Feb 17 14:32:53 crc kubenswrapper[4762]: I0217 14:32:53.988739 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-metadata" containerID="cri-o://7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" gracePeriod=30 Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.007040 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.564436333 podStartE2EDuration="11.007013406s" podCreationTimestamp="2026-02-17 14:32:43 +0000 UTC" firstStartedPulling="2026-02-17 14:32:46.98549247 +0000 UTC m=+1647.565493122" lastFinishedPulling="2026-02-17 14:32:52.428069543 +0000 UTC m=+1653.008070195" observedRunningTime="2026-02-17 14:32:54.003504121 +0000 UTC m=+1654.583504773" watchObservedRunningTime="2026-02-17 14:32:54.007013406 +0000 UTC m=+1654.587014058" Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.036431 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pxsl2" podStartSLOduration=9.264427733 podStartE2EDuration="14.036405802s" podCreationTimestamp="2026-02-17 14:32:40 +0000 UTC" firstStartedPulling="2026-02-17 14:32:47.943405568 +0000 UTC m=+1648.523406220" lastFinishedPulling="2026-02-17 14:32:52.715383637 +0000 UTC m=+1653.295384289" observedRunningTime="2026-02-17 14:32:54.031225862 +0000 UTC m=+1654.611226524" watchObservedRunningTime="2026-02-17 14:32:54.036405802 +0000 UTC m=+1654.616406454" Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.060615 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.695817149 podStartE2EDuration="10.060585547s" podCreationTimestamp="2026-02-17 14:32:44 +0000 UTC" firstStartedPulling="2026-02-17 14:32:48.062087713 +0000 UTC m=+1648.642088365" lastFinishedPulling="2026-02-17 14:32:52.426856111 +0000 UTC m=+1653.006856763" observedRunningTime="2026-02-17 14:32:54.049289641 +0000 UTC m=+1654.629290293" watchObservedRunningTime="2026-02-17 14:32:54.060585547 +0000 UTC m=+1654.640586199" Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.841251 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.841554 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:32:54 crc kubenswrapper[4762]: I0217 14:32:54.901434 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.001892 4762 generic.go:334] "Generic (PLEG): container finished" podID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerID="7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" exitCode=0 Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.002248 4762 generic.go:334] "Generic (PLEG): container finished" podID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerID="2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" exitCode=143 Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.001946 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerDied","Data":"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511"} Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.001958 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.002403 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerDied","Data":"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348"} Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.002449 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3e23dcf-8e71-4876-b67a-7649e342a8f2","Type":"ContainerDied","Data":"eab731ebf7af34438a6ea5ae402df0804a8a82145e242ad1e7df81525ae849b4"} Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.002481 4762 scope.go:117] "RemoveContainer" containerID="7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.013312 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data\") pod \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.014745 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkjrw\" (UniqueName: \"kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw\") pod \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.014838 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs\") pod \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.015110 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle\") pod \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\" (UID: \"c3e23dcf-8e71-4876-b67a-7649e342a8f2\") " Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.016983 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs" (OuterVolumeSpecName: "logs") pod "c3e23dcf-8e71-4876-b67a-7649e342a8f2" (UID: 
"c3e23dcf-8e71-4876-b67a-7649e342a8f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.045901 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw" (OuterVolumeSpecName: "kube-api-access-hkjrw") pod "c3e23dcf-8e71-4876-b67a-7649e342a8f2" (UID: "c3e23dcf-8e71-4876-b67a-7649e342a8f2"). InnerVolumeSpecName "kube-api-access-hkjrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.046523 4762 scope.go:117] "RemoveContainer" containerID="2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.074887 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data" (OuterVolumeSpecName: "config-data") pod "c3e23dcf-8e71-4876-b67a-7649e342a8f2" (UID: "c3e23dcf-8e71-4876-b67a-7649e342a8f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.079870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3e23dcf-8e71-4876-b67a-7649e342a8f2" (UID: "c3e23dcf-8e71-4876-b67a-7649e342a8f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.119383 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.119419 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e23dcf-8e71-4876-b67a-7649e342a8f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.119431 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjrw\" (UniqueName: \"kubernetes.io/projected/c3e23dcf-8e71-4876-b67a-7649e342a8f2-kube-api-access-hkjrw\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.119444 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3e23dcf-8e71-4876-b67a-7649e342a8f2-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.183788 4762 scope.go:117] "RemoveContainer" containerID="7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.184406 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511\": container with ID starting with 7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511 not found: ID does not exist" containerID="7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.184440 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511"} err="failed to get container status \"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511\": rpc error: code = NotFound desc = could not find container \"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511\": container with ID starting with 7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511 not found: ID does not exist" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.184464 4762 scope.go:117] "RemoveContainer" containerID="2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.184770 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348\": container with ID starting with 2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348 not found: ID does not exist" containerID="2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.184790 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348"} err="failed to get container status \"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348\": rpc error: code = NotFound desc = could not find container \"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348\": container with ID starting with 2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348 not found: ID does not exist" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.184803 4762 scope.go:117] "RemoveContainer" containerID="7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.185029 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511"} err="failed to get container status \"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511\": rpc error: code = NotFound desc = could not find container \"7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511\": container with ID starting with 7227f4b80ac34f78fb03168951cf09a588cc3f66eb56d9bb8d6f97303c9c4511 not found: ID does not exist" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.185069 4762 scope.go:117] "RemoveContainer" containerID="2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.185557 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348"} err="failed to get container status \"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348\": rpc error: code = NotFound desc = could not find container \"2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348\": container with ID starting with 2eed9bdbad0e2328638b6f429480b0a90ace73bdbd6579d268c7233fc10ec348 not found: ID does not exist" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.519567 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.534058 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 
14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.547548 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.548043 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5722df-f962-403c-abfa-793bc821be57" containerName="mariadb-account-create-update" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548072 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5722df-f962-403c-abfa-793bc821be57" containerName="mariadb-account-create-update" Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.548109 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" containerName="mariadb-database-create" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548116 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" containerName="mariadb-database-create" Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.548145 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-metadata" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548151 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-metadata" Feb 17 14:32:55 crc kubenswrapper[4762]: E0217 14:32:55.548164 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-log" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548170 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-log" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548380 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-metadata" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548403 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" containerName="mariadb-database-create" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548419 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" containerName="nova-metadata-log" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.548431 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5722df-f962-403c-abfa-793bc821be57" containerName="mariadb-account-create-update" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.549689 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.561957 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.562159 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.564792 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rpb\" (UniqueName: \"kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.564843 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.564893 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.564924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.564951 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.578128 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.667109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.667186 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.667432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rpb\" (UniqueName: \"kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " 
pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.667485 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.667543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.668020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.672506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.672826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.679347 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.703045 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rpb\" (UniqueName: \"kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb\") pod \"nova-metadata-0\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " pod="openstack/nova-metadata-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.742740 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 14:32:55 crc kubenswrapper[4762]: I0217 14:32:55.742969 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.017982 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.022984 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.031834 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.060156 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.073526 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:32:56 crc kubenswrapper[4762]: E0217 14:32:56.074326 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.109455 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e23dcf-8e71-4876-b67a-7649e342a8f2" path="/var/lib/kubelet/pods/c3e23dcf-8e71-4876-b67a-7649e342a8f2/volumes" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.162887 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.164136 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.317613 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"] Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.378153 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.815185 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-fgpcm"] Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.817867 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.826127 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.826217 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.826475 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xczfd" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.831079 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.848042 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fgpcm"] Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.994408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.994522 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.994588 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:56 crc kubenswrapper[4762]: I0217 14:32:56.994616 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnll\" (UniqueName: \"kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.094012 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.097442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.098259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.098406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle\") pod 
\"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.098483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phnll\" (UniqueName: \"kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.105863 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="dnsmasq-dns" containerID="cri-o://4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977" gracePeriod=10 Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.109924 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.117480 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.118949 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.137579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phnll\" (UniqueName: \"kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll\") pod \"aodh-db-sync-fgpcm\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:57 crc kubenswrapper[4762]: I0217 14:32:57.159308 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:32:58 crc kubenswrapper[4762]: I0217 14:32:58.267896 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.219:5353: connect: connection refused" Feb 17 14:32:58 crc kubenswrapper[4762]: I0217 14:32:58.343700 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerStarted","Data":"e2a32d1d313005911910a51d346bc7df12d6bd34cadf269d6eb4c1883ffb6ca0"} Feb 17 14:32:58 crc kubenswrapper[4762]: I0217 14:32:58.343744 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerStarted","Data":"68919d49475d3adf57a818c75ed4521cdd30f29be0a9151bb4582cddf1fef5b5"} Feb 17 14:32:58 crc kubenswrapper[4762]: I0217 14:32:58.345778 4762 generic.go:334] "Generic (PLEG): container finished" podID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerID="4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977" exitCode=0 Feb 17 14:32:58 crc kubenswrapper[4762]: I0217 14:32:58.345813 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" event={"ID":"7f033533-f8f8-4196-9fdd-31a14b0f019d","Type":"ContainerDied","Data":"4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977"} Feb 17 14:32:58 crc kubenswrapper[4762]: E0217 14:32:58.949169 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f033533_f8f8_4196_9fdd_31a14b0f019d.slice/crio-4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.087277 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm"
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.145509 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw6dj\" (UniqueName: \"kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.145619 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.145710 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.145777 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.145914 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.146061 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc\") pod \"7f033533-f8f8-4196-9fdd-31a14b0f019d\" (UID: \"7f033533-f8f8-4196-9fdd-31a14b0f019d\") "
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.151045 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fgpcm"]
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.154554 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj" (OuterVolumeSpecName: "kube-api-access-mw6dj") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "kube-api-access-mw6dj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.230148 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config" (OuterVolumeSpecName: "config") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.243324 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.244988 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.252459 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.252495 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw6dj\" (UniqueName: \"kubernetes.io/projected/7f033533-f8f8-4196-9fdd-31a14b0f019d-kube-api-access-mw6dj\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.252506 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.252515 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.262457 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.267209 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f033533-f8f8-4196-9fdd-31a14b0f019d" (UID: "7f033533-f8f8-4196-9fdd-31a14b0f019d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.355291 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.355338 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f033533-f8f8-4196-9fdd-31a14b0f019d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.360978 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerStarted","Data":"b14a8ac2dd8b67ef5d9e92a1abe7bc5dcaa8568683fcb2eff6bec4554fb1e657"}
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.364951 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm"
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.365500 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-wntzm" event={"ID":"7f033533-f8f8-4196-9fdd-31a14b0f019d","Type":"ContainerDied","Data":"da8f2182c8d9b8762d3460dfcded9af6ff36eb8838370579dd722e5bcb95a16d"}
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.365565 4762 scope.go:117] "RemoveContainer" containerID="4f18091437fbcbef71845fdabfa8e7449abbec763e140344c9ad8714c7304977"
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.367578 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fgpcm" event={"ID":"82cbcf38-171c-4676-988f-a742b4277bb6","Type":"ContainerStarted","Data":"36581f4c09232f28614fef9e187c4652899e062f400ffce3aa4999e8ba6b1519"}
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.397602 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.397580781 podStartE2EDuration="4.397580781s" podCreationTimestamp="2026-02-17 14:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:59.391400784 +0000 UTC m=+1659.971401446" watchObservedRunningTime="2026-02-17 14:32:59.397580781 +0000 UTC m=+1659.977581433"
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.451477 4762 scope.go:117] "RemoveContainer" containerID="e4953faad0e578de9b5623a5cfa350b5b1615f2951a2f3335e22b610c29c27a2"
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.454339 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"]
Feb 17 14:32:59 crc kubenswrapper[4762]: I0217 14:32:59.473198 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-wntzm"]
Feb 17 14:33:00 crc kubenswrapper[4762]: I0217 14:33:00.098861 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" path="/var/lib/kubelet/pods/7f033533-f8f8-4196-9fdd-31a14b0f019d/volumes"
Feb 17 14:33:01 crc kubenswrapper[4762]: I0217 14:33:01.024466 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 14:33:01 crc kubenswrapper[4762]: I0217 14:33:01.025556 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 14:33:01 crc kubenswrapper[4762]: I0217 14:33:01.134188 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pxsl2"
Feb 17 14:33:01 crc kubenswrapper[4762]: I0217 14:33:01.134264 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pxsl2"
Feb 17 14:33:02 crc kubenswrapper[4762]: I0217 14:33:02.205132 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pxsl2" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:33:02 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:33:02 crc kubenswrapper[4762]: >
Feb 17 14:33:02 crc kubenswrapper[4762]: I0217 14:33:02.483692 4762 generic.go:334] "Generic (PLEG): container finished" podID="a4589d86-754e-46ec-bd8f-412abdf21890" containerID="0a7db91915ffc089979e848f81e2557ee1f9543eceec4a23d5f5ea6017f3e657" exitCode=0
Feb 17 14:33:02 crc kubenswrapper[4762]: I0217 14:33:02.484776 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wdbb8" event={"ID":"a4589d86-754e-46ec-bd8f-412abdf21890","Type":"ContainerDied","Data":"0a7db91915ffc089979e848f81e2557ee1f9543eceec4a23d5f5ea6017f3e657"}
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.445974 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wdbb8"
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.541629 4762 generic.go:334] "Generic (PLEG): container finished" podID="5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" containerID="745b57e6bf2efa1b71aa23513113a2fb00baba1fc7cb99b978eda5e9db9a2354" exitCode=0
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.541708 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" event={"ID":"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d","Type":"ContainerDied","Data":"745b57e6bf2efa1b71aa23513113a2fb00baba1fc7cb99b978eda5e9db9a2354"}
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.545794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wdbb8" event={"ID":"a4589d86-754e-46ec-bd8f-412abdf21890","Type":"ContainerDied","Data":"5e6a6cc960de7e807aea06e474caf0188d0711267cba6796ceb1ff821e24407c"}
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.545846 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6a6cc960de7e807aea06e474caf0188d0711267cba6796ceb1ff821e24407c"
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.545896 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wdbb8"
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.553566 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts\") pod \"a4589d86-754e-46ec-bd8f-412abdf21890\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") "
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.553683 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data\") pod \"a4589d86-754e-46ec-bd8f-412abdf21890\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") "
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.553870 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle\") pod \"a4589d86-754e-46ec-bd8f-412abdf21890\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") "
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.554130 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj4p9\" (UniqueName: \"kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9\") pod \"a4589d86-754e-46ec-bd8f-412abdf21890\" (UID: \"a4589d86-754e-46ec-bd8f-412abdf21890\") "
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.562403 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts" (OuterVolumeSpecName: "scripts") pod "a4589d86-754e-46ec-bd8f-412abdf21890" (UID: "a4589d86-754e-46ec-bd8f-412abdf21890"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.570222 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9" (OuterVolumeSpecName: "kube-api-access-xj4p9") pod "a4589d86-754e-46ec-bd8f-412abdf21890" (UID: "a4589d86-754e-46ec-bd8f-412abdf21890"). InnerVolumeSpecName "kube-api-access-xj4p9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.606286 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data" (OuterVolumeSpecName: "config-data") pod "a4589d86-754e-46ec-bd8f-412abdf21890" (UID: "a4589d86-754e-46ec-bd8f-412abdf21890"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.651850 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4589d86-754e-46ec-bd8f-412abdf21890" (UID: "a4589d86-754e-46ec-bd8f-412abdf21890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
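The startup-probe failure above is the probe command inside community-operators-pxsl2 timing out against the registry-server gRPC port; the "timeout: failed to connect service ..." text between output=< and > is the probe's own stdout, relayed by the kubelet, not a kubelet message. A rough stand-in for that check with the same one-second budget (the port and timeout come from the log; the address and everything else are assumed):

```go
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	// Plain TCP dial as a stand-in for the gRPC health check on :50051.
	conn, err := net.DialTimeout("tcp", "localhost:50051", time.Second)
	if err != nil {
		fmt.Fprintf(os.Stderr, "timeout: failed to connect service %q within 1s: %v\n", ":50051", err)
		os.Exit(1) // a non-zero exit is what marks the probe attempt as failed
	}
	conn.Close()
}
```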
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.658009 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.658060 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj4p9\" (UniqueName: \"kubernetes.io/projected/a4589d86-754e-46ec-bd8f-412abdf21890-kube-api-access-xj4p9\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.658079 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.658095 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4589d86-754e-46ec-bd8f-412abdf21890-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.709433 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.709757 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-log" containerID="cri-o://cd83dad5e360685ebc38eca2aca36eb53edbcf6f534129f8b4cc39e91add98cf" gracePeriod=30 Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.709904 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-api" containerID="cri-o://dd1d61e4395f1ee047a795522118292aa07dc39bf280cbf996e58279b1113a81" gracePeriod=30 Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.745787 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.746081 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerName="nova-scheduler-scheduler" containerID="cri-o://03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e" gracePeriod=30 Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.775822 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.776394 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-log" containerID="cri-o://e2a32d1d313005911910a51d346bc7df12d6bd34cadf269d6eb4c1883ffb6ca0" gracePeriod=30 Feb 17 14:33:04 crc kubenswrapper[4762]: I0217 14:33:04.776570 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-metadata" containerID="cri-o://b14a8ac2dd8b67ef5d9e92a1abe7bc5dcaa8568683fcb2eff6bec4554fb1e657" gracePeriod=30 Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.573743 4762 generic.go:334] "Generic (PLEG): container finished" podID="d485c47e-bce9-40a7-8a87-4b337f908b48" 
containerID="f6ed86882b8a6fc97ef15682de3e38aa93b3d6ba89042608649ec488ff9de44b" exitCode=0 Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.573823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerDied","Data":"f6ed86882b8a6fc97ef15682de3e38aa93b3d6ba89042608649ec488ff9de44b"} Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.578067 4762 generic.go:334] "Generic (PLEG): container finished" podID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerID="cd83dad5e360685ebc38eca2aca36eb53edbcf6f534129f8b4cc39e91add98cf" exitCode=143 Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.578150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerDied","Data":"cd83dad5e360685ebc38eca2aca36eb53edbcf6f534129f8b4cc39e91add98cf"} Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.581122 4762 generic.go:334] "Generic (PLEG): container finished" podID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerID="b14a8ac2dd8b67ef5d9e92a1abe7bc5dcaa8568683fcb2eff6bec4554fb1e657" exitCode=0 Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.581145 4762 generic.go:334] "Generic (PLEG): container finished" podID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerID="e2a32d1d313005911910a51d346bc7df12d6bd34cadf269d6eb4c1883ffb6ca0" exitCode=143 Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.581173 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerDied","Data":"b14a8ac2dd8b67ef5d9e92a1abe7bc5dcaa8568683fcb2eff6bec4554fb1e657"} Feb 17 14:33:05 crc kubenswrapper[4762]: I0217 14:33:05.581208 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerDied","Data":"e2a32d1d313005911910a51d346bc7df12d6bd34cadf269d6eb4c1883ffb6ca0"} Feb 17 14:33:05 crc kubenswrapper[4762]: E0217 14:33:05.755718 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:05 crc kubenswrapper[4762]: E0217 14:33:05.759063 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:05 crc kubenswrapper[4762]: E0217 14:33:05.762605 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:05 crc kubenswrapper[4762]: E0217 14:33:05.762715 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerName="nova-scheduler-scheduler" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.004027 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.084747 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.104309 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data\") pod \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.104676 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs\") pod \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.104842 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle\") pod \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.104974 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs\") pod \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.105128 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rpb\" (UniqueName: \"kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb\") pod \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\" (UID: \"31683ab9-e5fb-43f0-9e27-6e5b86c3e027\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.105613 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs" (OuterVolumeSpecName: "logs") pod "31683ab9-e5fb-43f0-9e27-6e5b86c3e027" (UID: "31683ab9-e5fb-43f0-9e27-6e5b86c3e027"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.106502 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.117034 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb" (OuterVolumeSpecName: "kube-api-access-c4rpb") pod "31683ab9-e5fb-43f0-9e27-6e5b86c3e027" (UID: "31683ab9-e5fb-43f0-9e27-6e5b86c3e027"). InnerVolumeSpecName "kube-api-access-c4rpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.140837 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data" (OuterVolumeSpecName: "config-data") pod "31683ab9-e5fb-43f0-9e27-6e5b86c3e027" (UID: "31683ab9-e5fb-43f0-9e27-6e5b86c3e027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.153787 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31683ab9-e5fb-43f0-9e27-6e5b86c3e027" (UID: "31683ab9-e5fb-43f0-9e27-6e5b86c3e027"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.187868 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "31683ab9-e5fb-43f0-9e27-6e5b86c3e027" (UID: "31683ab9-e5fb-43f0-9e27-6e5b86c3e027"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.208506 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle\") pod \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.208587 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts\") pod \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.208706 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckjqt\" (UniqueName: \"kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt\") pod \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.208762 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") pod \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") " Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.210163 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.210184 4762 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.210198 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rpb\" (UniqueName: 
\"kubernetes.io/projected/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-kube-api-access-c4rpb\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.210207 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31683ab9-e5fb-43f0-9e27-6e5b86c3e027-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.213380 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts" (OuterVolumeSpecName: "scripts") pod "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" (UID: "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.214165 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt" (OuterVolumeSpecName: "kube-api-access-ckjqt") pod "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" (UID: "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d"). InnerVolumeSpecName "kube-api-access-ckjqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.238085 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data podName:5ae10efe-5821-4182-8f8b-bd9c6cc13a4d nodeName:}" failed. No retries permitted until 2026-02-17 14:33:06.738049823 +0000 UTC m=+1667.318050475 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data") pod "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" (UID: "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d") : error deleting /var/lib/kubelet/pods/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d/volume-subpaths: remove /var/lib/kubelet/pods/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d/volume-subpaths: no such file or directory Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.240470 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" (UID: "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.313763 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.313804 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.313820 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckjqt\" (UniqueName: \"kubernetes.io/projected/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-kube-api-access-ckjqt\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.611468 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.628893 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.628903 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31683ab9-e5fb-43f0-9e27-6e5b86c3e027","Type":"ContainerDied","Data":"68919d49475d3adf57a818c75ed4521cdd30f29be0a9151bb4582cddf1fef5b5"} Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.628952 4762 scope.go:117] "RemoveContainer" containerID="b14a8ac2dd8b67ef5d9e92a1abe7bc5dcaa8568683fcb2eff6bec4554fb1e657" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.637222 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9zsnn" event={"ID":"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d","Type":"ContainerDied","Data":"48bb7bc152b68943f9f2875120b6c7c8a7a3a8183af10a16fa848d6a559b8f32"} Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.637250 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48bb7bc152b68943f9f2875120b6c7c8a7a3a8183af10a16fa848d6a559b8f32" Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.637415 4762 util.go:48] "No ready sandbox for pod can be found. 
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.639966 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fgpcm" event={"ID":"82cbcf38-171c-4676-988f-a742b4277bb6","Type":"ContainerStarted","Data":"561cbb4ba0f490708913ac6ccd73f550bfd7b006b2b4821a8959f193b20c40bb"}
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.659477 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-fgpcm" podStartSLOduration=4.099070003 podStartE2EDuration="10.659451658s" podCreationTimestamp="2026-02-17 14:32:56 +0000 UTC" firstStartedPulling="2026-02-17 14:32:59.13840041 +0000 UTC m=+1659.718401062" lastFinishedPulling="2026-02-17 14:33:05.698782075 +0000 UTC m=+1666.278782717" observedRunningTime="2026-02-17 14:33:06.654240027 +0000 UTC m=+1667.234240689" watchObservedRunningTime="2026-02-17 14:33:06.659451658 +0000 UTC m=+1667.239452310"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.664294 4762 scope.go:117] "RemoveContainer" containerID="e2a32d1d313005911910a51d346bc7df12d6bd34cadf269d6eb4c1883ffb6ca0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.731163 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.731960 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" containerName="nova-cell1-conductor-db-sync"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.731984 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" containerName="nova-cell1-conductor-db-sync"
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.732017 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="dnsmasq-dns"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732034 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="dnsmasq-dns"
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.732047 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4589d86-754e-46ec-bd8f-412abdf21890" containerName="nova-manage"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732055 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4589d86-754e-46ec-bd8f-412abdf21890" containerName="nova-manage"
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.732073 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-metadata"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732081 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-metadata"
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.732106 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-log"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732115 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-log"
Feb 17 14:33:06 crc kubenswrapper[4762]: E0217 14:33:06.732152 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="init"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732162 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="init"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732459 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" containerName="nova-cell1-conductor-db-sync"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732484 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4589d86-754e-46ec-bd8f-412abdf21890" containerName="nova-manage"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732495 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-metadata"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732507 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f033533-f8f8-4196-9fdd-31a14b0f019d" containerName="dnsmasq-dns"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.732534 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" containerName="nova-metadata-log"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.733682 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.758335 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.777985 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.789695 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.802180 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.804906 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
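The pod_startup_latency_tracker line for aodh-db-sync-fgpcm above distinguishes podStartE2EDuration (pod creation to observed running) from podStartSLOduration, which excludes image-pull time. The logged numbers check out arithmetically from the monotonic (m=+...) offsets:

```go
package main

import "fmt"

func main() {
	// Values taken from the aodh-db-sync-fgpcm tracker line above.
	const (
		e2e          = 10.659451658   // podStartE2EDuration, seconds
		firstPulling = 1659.718401062 // firstStartedPulling, monotonic m=+ offset
		lastPulling  = 1666.278782717 // lastFinishedPulling, monotonic m=+ offset
	)
	pull := lastPulling - firstPulling // ~6.560381655 s spent pulling the image
	// E2E minus pull time gives ~4.099070003 s, matching podStartSLOduration.
	fmt.Printf("podStartSLOduration = %.9f s\n", e2e-pull)
}
```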
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.809217 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.809317 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.817057 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.825201 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") pod \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\" (UID: \"5ae10efe-5821-4182-8f8b-bd9c6cc13a4d\") "
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.826128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qgr\" (UniqueName: \"kubernetes.io/projected/c779d9da-d7c8-4829-b255-a1f4749f0fbe-kube-api-access-n5qgr\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.831512 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data" (OuterVolumeSpecName: "config-data") pod "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" (UID: "5ae10efe-5821-4182-8f8b-bd9c6cc13a4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.833541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.833867 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.834250 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.936204 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.936499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.936699 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.936841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.937013 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.937127 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.937223 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qgr\" (UniqueName: \"kubernetes.io/projected/c779d9da-d7c8-4829-b255-a1f4749f0fbe-kube-api-access-n5qgr\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.937362 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qxbr\" (UniqueName: \"kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.944630 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.945626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c779d9da-d7c8-4829-b255-a1f4749f0fbe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:06 crc kubenswrapper[4762]: I0217 14:33:06.958939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qgr\" (UniqueName: \"kubernetes.io/projected/c779d9da-d7c8-4829-b255-a1f4749f0fbe-kube-api-access-n5qgr\") pod \"nova-cell1-conductor-0\" (UID: \"c779d9da-d7c8-4829-b255-a1f4749f0fbe\") " pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.040907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.040982 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.041109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qxbr\" (UniqueName: \"kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.041259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.041364 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.041893 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.044425 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.044437 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.051281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.062523 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.062825 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qxbr\" (UniqueName: \"kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr\") pod \"nova-metadata-0\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.072175 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46"
Feb 17 14:33:07 crc kubenswrapper[4762]: E0217 14:33:07.072405 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.127522 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.618267 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.644694 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:33:07 crc kubenswrapper[4762]: W0217 14:33:07.644911 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bb9f998_3134_4e4b_91ee_6ee679264798.slice/crio-0f059d172921ce2b383b4c866b68d8981d326ef4dfa08b7f3b63c0b7f9285426 WatchSource:0}: Error finding container 0f059d172921ce2b383b4c866b68d8981d326ef4dfa08b7f3b63c0b7f9285426: Status 404 returned error can't find the container with id 0f059d172921ce2b383b4c866b68d8981d326ef4dfa08b7f3b63c0b7f9285426
Feb 17 14:33:07 crc kubenswrapper[4762]: I0217 14:33:07.660764 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c779d9da-d7c8-4829-b255-a1f4749f0fbe","Type":"ContainerStarted","Data":"6ddb7a890ce1c950024b24f5948fb985886ca482770ec2cc499aab946123ad6a"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.093631 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31683ab9-e5fb-43f0-9e27-6e5b86c3e027" path="/var/lib/kubelet/pods/31683ab9-e5fb-43f0-9e27-6e5b86c3e027/volumes"
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.689449 4762 generic.go:334] "Generic (PLEG): container finished" podID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerID="dd1d61e4395f1ee047a795522118292aa07dc39bf280cbf996e58279b1113a81" exitCode=0
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.689784 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerDied","Data":"dd1d61e4395f1ee047a795522118292aa07dc39bf280cbf996e58279b1113a81"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.696052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerStarted","Data":"59773f5a9db93ad22b346d36f4b50875a85c9b2c4b699bcec80eb85aa725692e"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.696096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerStarted","Data":"e516c2d595f01a19af1b3b7531bf2bd3e4520e05d113cb97d33cbdbed416b182"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.696105 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerStarted","Data":"0f059d172921ce2b383b4c866b68d8981d326ef4dfa08b7f3b63c0b7f9285426"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.700729 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c779d9da-d7c8-4829-b255-a1f4749f0fbe","Type":"ContainerStarted","Data":"aaf99821958cfb5d544a15661728aa44f6792ba860b703499b24ceda08db2551"}
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.701412 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.740103 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.740078881 podStartE2EDuration="2.740078881s" podCreationTimestamp="2026-02-17 14:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:08.726923935 +0000 UTC m=+1669.306924597" watchObservedRunningTime="2026-02-17 14:33:08.740078881 +0000 UTC m=+1669.320079533"
Feb 17 14:33:08 crc kubenswrapper[4762]: I0217 14:33:08.768301 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.768272525 podStartE2EDuration="2.768272525s" podCreationTimestamp="2026-02-17 14:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:08.749126727 +0000 UTC m=+1669.329127379" watchObservedRunningTime="2026-02-17 14:33:08.768272525 +0000 UTC m=+1669.348273177"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.337944 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
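For nova-metadata-0 and nova-cell1-conductor-0 above, firstStartedPulling and lastFinishedPulling are the zero time (0001-01-01), meaning no image pull was needed, so podStartSLOduration equals podStartE2EDuration. That E2E value is simply observedRunningTime minus podCreationTimestamp, which is easy to verify:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps taken from the nova-metadata-0 tracker line above.
	created := time.Date(2026, 2, 17, 14, 33, 6, 0, time.UTC)         // podCreationTimestamp
	running := time.Date(2026, 2, 17, 14, 33, 8, 740078881, time.UTC) // observedRunningTime
	fmt.Println(running.Sub(created)) // 2.740078881s, matching podStartE2EDuration
}
```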
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.405990 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs\") pod \"95ebdcf8-a028-49e2-b555-6505f8b0765a\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.406129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle\") pod \"95ebdcf8-a028-49e2-b555-6505f8b0765a\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.406188 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-548t8\" (UniqueName: \"kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8\") pod \"95ebdcf8-a028-49e2-b555-6505f8b0765a\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.406347 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data\") pod \"95ebdcf8-a028-49e2-b555-6505f8b0765a\" (UID: \"95ebdcf8-a028-49e2-b555-6505f8b0765a\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.406617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs" (OuterVolumeSpecName: "logs") pod "95ebdcf8-a028-49e2-b555-6505f8b0765a" (UID: "95ebdcf8-a028-49e2-b555-6505f8b0765a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.407538 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ebdcf8-a028-49e2-b555-6505f8b0765a-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.439907 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8" (OuterVolumeSpecName: "kube-api-access-548t8") pod "95ebdcf8-a028-49e2-b555-6505f8b0765a" (UID: "95ebdcf8-a028-49e2-b555-6505f8b0765a"). InnerVolumeSpecName "kube-api-access-548t8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.494814 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ebdcf8-a028-49e2-b555-6505f8b0765a" (UID: "95ebdcf8-a028-49e2-b555-6505f8b0765a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.499211 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data" (OuterVolumeSpecName: "config-data") pod "95ebdcf8-a028-49e2-b555-6505f8b0765a" (UID: "95ebdcf8-a028-49e2-b555-6505f8b0765a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.509890 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.510163 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-548t8\" (UniqueName: \"kubernetes.io/projected/95ebdcf8-a028-49e2-b555-6505f8b0765a-kube-api-access-548t8\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.510180 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ebdcf8-a028-49e2-b555-6505f8b0765a-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.516906 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.611759 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle\") pod \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.611842 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmbwd\" (UniqueName: \"kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd\") pod \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.612251 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data\") pod \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\" (UID: \"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263\") "
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.631231 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd" (OuterVolumeSpecName: "kube-api-access-rmbwd") pod "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" (UID: "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263"). InnerVolumeSpecName "kube-api-access-rmbwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.652144 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data" (OuterVolumeSpecName: "config-data") pod "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" (UID: "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.682439 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" (UID: "3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.716736 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.716783 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.716802 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmbwd\" (UniqueName: \"kubernetes.io/projected/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263-kube-api-access-rmbwd\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.728547 4762 generic.go:334] "Generic (PLEG): container finished" podID="82cbcf38-171c-4676-988f-a742b4277bb6" containerID="561cbb4ba0f490708913ac6ccd73f550bfd7b006b2b4821a8959f193b20c40bb" exitCode=0
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.728689 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fgpcm" event={"ID":"82cbcf38-171c-4676-988f-a742b4277bb6","Type":"ContainerDied","Data":"561cbb4ba0f490708913ac6ccd73f550bfd7b006b2b4821a8959f193b20c40bb"}
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.732579 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e" exitCode=0
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.732706 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.732766 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263","Type":"ContainerDied","Data":"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"}
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.732812 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263","Type":"ContainerDied","Data":"17d1956323d8484a803dc651c21c1bfca2808c75b401feb741d42f57fb0426dd"}
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.732836 4762 scope.go:117] "RemoveContainer" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.736685 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.736756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ebdcf8-a028-49e2-b555-6505f8b0765a","Type":"ContainerDied","Data":"bdfaf0d66f4f4e9b5bb546474a6be765a09ae2bc62c10879ecdda7ba7e7e6620"}
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.794842 4762 scope.go:117] "RemoveContainer" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"
Feb 17 14:33:09 crc kubenswrapper[4762]: E0217 14:33:09.795389 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e\": container with ID starting with 03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e not found: ID does not exist" containerID="03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.795468 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e"} err="failed to get container status \"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e\": rpc error: code = NotFound desc = could not find container \"03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e\": container with ID starting with 03dfae66b5c1361a74551d88697d5917665bb9bcac16fb5222e9dfd07610420e not found: ID does not exist"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.795498 4762 scope.go:117] "RemoveContainer" containerID="dd1d61e4395f1ee047a795522118292aa07dc39bf280cbf996e58279b1113a81"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.808359 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.824861 4762 scope.go:117] "RemoveContainer" containerID="cd83dad5e360685ebc38eca2aca36eb53edbcf6f534129f8b4cc39e91add98cf"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.827188 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.839018 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.851095 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.869811 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 14:33:09 crc kubenswrapper[4762]: E0217 14:33:09.870590 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerName="nova-scheduler-scheduler"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.870621 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerName="nova-scheduler-scheduler"
Feb 17 14:33:09 crc kubenswrapper[4762]: E0217 14:33:09.870676 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-log"
Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.870686 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-log"
Feb 17 14:33:09 crc kubenswrapper[4762]: E0217 14:33:09.870707
4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-api" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.870717 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-api" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.871016 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" containerName="nova-scheduler-scheduler" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.871050 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-api" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.871066 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" containerName="nova-api-log" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.872277 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.875714 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.908246 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.921843 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.924504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.927434 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:33:09 crc kubenswrapper[4762]: I0217 14:33:09.934074 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.024275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.024399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhs5\" (UniqueName: \"kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.025509 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.090063 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263" path="/var/lib/kubelet/pods/3c68ed32-bb5f-40f0-9cd5-5ac1d13fa263/volumes" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.091352 4762 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ebdcf8-a028-49e2-b555-6505f8b0765a" path="/var/lib/kubelet/pods/95ebdcf8-a028-49e2-b555-6505f8b0765a/volumes" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.127810 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.127907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.128049 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.128096 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.128123 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.128153 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm5b\" (UniqueName: \"kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.128258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhs5\" (UniqueName: \"kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.133208 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.146398 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.151213 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhs5\" (UniqueName: \"kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5\") pod \"nova-scheduler-0\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.207893 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.230145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.230527 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.230564 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm5b\" (UniqueName: \"kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.230786 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.231559 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.234852 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.235955 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.254726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm5b\" (UniqueName: \"kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b\") pod \"nova-api-0\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.260963 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.761926 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:10 crc kubenswrapper[4762]: I0217 14:33:10.905901 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.165840 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.366011 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts\") pod \"82cbcf38-171c-4676-988f-a742b4277bb6\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.366419 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phnll\" (UniqueName: \"kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll\") pod \"82cbcf38-171c-4676-988f-a742b4277bb6\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.366453 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data\") pod \"82cbcf38-171c-4676-988f-a742b4277bb6\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.366474 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle\") pod \"82cbcf38-171c-4676-988f-a742b4277bb6\" (UID: \"82cbcf38-171c-4676-988f-a742b4277bb6\") " Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.372341 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts" (OuterVolumeSpecName: "scripts") pod "82cbcf38-171c-4676-988f-a742b4277bb6" (UID: "82cbcf38-171c-4676-988f-a742b4277bb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.372416 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll" (OuterVolumeSpecName: "kube-api-access-phnll") pod "82cbcf38-171c-4676-988f-a742b4277bb6" (UID: "82cbcf38-171c-4676-988f-a742b4277bb6"). InnerVolumeSpecName "kube-api-access-phnll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.407692 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data" (OuterVolumeSpecName: "config-data") pod "82cbcf38-171c-4676-988f-a742b4277bb6" (UID: "82cbcf38-171c-4676-988f-a742b4277bb6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.411265 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82cbcf38-171c-4676-988f-a742b4277bb6" (UID: "82cbcf38-171c-4676-988f-a742b4277bb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.470380 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phnll\" (UniqueName: \"kubernetes.io/projected/82cbcf38-171c-4676-988f-a742b4277bb6-kube-api-access-phnll\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.470426 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.470440 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.470453 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82cbcf38-171c-4676-988f-a742b4277bb6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.805616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfef0bfb-1f5e-4c74-b451-624612d99d6f","Type":"ContainerStarted","Data":"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.805700 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfef0bfb-1f5e-4c74-b451-624612d99d6f","Type":"ContainerStarted","Data":"846e418f8241e923cd0905f061b4652273d506eb4374b354b28a2585ec2c0ea2"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.808752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fgpcm" event={"ID":"82cbcf38-171c-4676-988f-a742b4277bb6","Type":"ContainerDied","Data":"36581f4c09232f28614fef9e187c4652899e062f400ffce3aa4999e8ba6b1519"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.808804 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36581f4c09232f28614fef9e187c4652899e062f400ffce3aa4999e8ba6b1519" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.808769 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fgpcm" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.811171 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerStarted","Data":"4f14bc89881c6bbb12f27ae85df3b3fea2f73bd13540648b9480b82879ea3abd"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.811212 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerStarted","Data":"74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.811228 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerStarted","Data":"f4cbba9bde8c54b8ec212f2e279f6679cc2af812e3ed91ddbcfd203cac5396ff"} Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.832946 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8329220939999997 podStartE2EDuration="2.832922094s" podCreationTimestamp="2026-02-17 14:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:11.826949902 +0000 UTC m=+1672.406950554" watchObservedRunningTime="2026-02-17 14:33:11.832922094 +0000 UTC m=+1672.412922746" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.854583 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.85456063 podStartE2EDuration="2.85456063s" podCreationTimestamp="2026-02-17 14:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:11.852299918 +0000 UTC m=+1672.432300570" watchObservedRunningTime="2026-02-17 14:33:11.85456063 +0000 UTC m=+1672.434561282" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.950939 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:11 crc kubenswrapper[4762]: E0217 14:33:11.951924 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cbcf38-171c-4676-988f-a742b4277bb6" containerName="aodh-db-sync" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.951946 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cbcf38-171c-4676-988f-a742b4277bb6" containerName="aodh-db-sync" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.952177 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="82cbcf38-171c-4676-988f-a742b4277bb6" containerName="aodh-db-sync" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.954727 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.960950 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.961550 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xczfd" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.961877 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 17 14:33:11 crc kubenswrapper[4762]: I0217 14:33:11.980240 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.089566 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.089828 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.090430 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswdq\" (UniqueName: \"kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.090567 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.128524 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.129963 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.192162 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswdq\" (UniqueName: \"kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.192711 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.192842 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 
14:33:12.192882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.204966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.208986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.210794 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.212996 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bswdq\" (UniqueName: \"kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq\") pod \"aodh-0\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.218083 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pxsl2" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" probeResult="failure" output=< Feb 17 14:33:12 crc kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Feb 17 14:33:12 crc kubenswrapper[4762]: > Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.296132 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.861695 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:33:12 crc kubenswrapper[4762]: I0217 14:33:12.866612 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:13 crc kubenswrapper[4762]: I0217 14:33:13.849169 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerStarted","Data":"00d51b9c5984bae0b8d24c37dc4dfe0832ed42d5c56b0c95a5ee8bd82342d8e1"} Feb 17 14:33:13 crc kubenswrapper[4762]: I0217 14:33:13.849811 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerStarted","Data":"9308474c37c5a8ca5541eac43b1e7794910c70dd81935ec11b5856cfd5055da9"} Feb 17 14:33:15 crc kubenswrapper[4762]: I0217 14:33:15.142712 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:15 crc kubenswrapper[4762]: I0217 14:33:15.209070 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 14:33:15 crc kubenswrapper[4762]: I0217 14:33:15.877764 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerStarted","Data":"bfaa8ab977c5183d16ee6888a20a0627d6e28f847b814137b81448edd19e2403"} Feb 17 14:33:16 crc kubenswrapper[4762]: I0217 14:33:16.940918 4762 generic.go:334] "Generic (PLEG): container finished" podID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerID="67387f4a707dde3c0a45f58e23b87997dccd841113d7e155a74b27c87b083720" exitCode=137 Feb 17 14:33:16 crc kubenswrapper[4762]: I0217 14:33:16.941221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerDied","Data":"67387f4a707dde3c0a45f58e23b87997dccd841113d7e155a74b27c87b083720"} Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.128356 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.128801 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.168860 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.330398 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441120 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441242 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441322 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljdhx\" (UniqueName: \"kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441480 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.441500 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle\") pod \"d485c47e-bce9-40a7-8a87-4b337f908b48\" (UID: \"d485c47e-bce9-40a7-8a87-4b337f908b48\") " Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.442582 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.442617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.450094 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts" (OuterVolumeSpecName: "scripts") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.450152 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx" (OuterVolumeSpecName: "kube-api-access-ljdhx") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "kube-api-access-ljdhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.489893 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.549266 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljdhx\" (UniqueName: \"kubernetes.io/projected/d485c47e-bce9-40a7-8a87-4b337f908b48-kube-api-access-ljdhx\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.549310 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.549324 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.549336 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d485c47e-bce9-40a7-8a87-4b337f908b48-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.549346 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.587828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.613948 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data" (OuterVolumeSpecName: "config-data") pod "d485c47e-bce9-40a7-8a87-4b337f908b48" (UID: "d485c47e-bce9-40a7-8a87-4b337f908b48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.651366 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.651403 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d485c47e-bce9-40a7-8a87-4b337f908b48-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.962041 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerStarted","Data":"dc6c4a8ff8fae25315b467372ae51fb8f33bb19f086ced32b0839a20fe2f12e2"} Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.968061 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d485c47e-bce9-40a7-8a87-4b337f908b48","Type":"ContainerDied","Data":"385f7ee76b29aeefcf94df508b106460b68ac231c2258f670aa35452bc572a81"} Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.968430 4762 scope.go:117] "RemoveContainer" containerID="67387f4a707dde3c0a45f58e23b87997dccd841113d7e155a74b27c87b083720" Feb 17 14:33:17 crc kubenswrapper[4762]: I0217 14:33:17.968197 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.010426 4762 scope.go:117] "RemoveContainer" containerID="e64f3111613e1c77e6b75a922b272b144d351f2b7b739fac7dde6366b2ec1344" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.021013 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.065286 4762 scope.go:117] "RemoveContainer" containerID="1959b3497a489f7f2471031234df2e8f3d9f1f74c04b832f1f4889c159828db8" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.065462 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.072028 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:33:18 crc kubenswrapper[4762]: E0217 14:33:18.072729 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.092777 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" path="/var/lib/kubelet/pods/d485c47e-bce9-40a7-8a87-4b337f908b48/volumes" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.101517 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:18 crc kubenswrapper[4762]: E0217 14:33:18.102022 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="proxy-httpd" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102045 4762 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="proxy-httpd" Feb 17 14:33:18 crc kubenswrapper[4762]: E0217 14:33:18.102078 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-central-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102087 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-central-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: E0217 14:33:18.102137 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-notification-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102147 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-notification-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: E0217 14:33:18.102193 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="sg-core" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102202 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="sg-core" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102510 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-notification-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102542 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="ceilometer-central-agent" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102566 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="proxy-httpd" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.102576 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d485c47e-bce9-40a7-8a87-4b337f908b48" containerName="sg-core" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.109392 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.109529 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.124974 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.125211 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.139629 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.253:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.144808 4762 scope.go:117] "RemoveContainer" containerID="f6ed86882b8a6fc97ef15682de3e38aa93b3d6ba89042608649ec488ff9de44b" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.181800 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.253:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268683 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268778 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268807 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.268830 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ws6\" (UniqueName: \"kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 
14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.269130 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.371885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.372862 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.373198 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.373283 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ws6\" (UniqueName: \"kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.373522 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.374038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.374052 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.374074 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.374489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc 
kubenswrapper[4762]: I0217 14:33:18.378169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.389166 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.389269 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.389417 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.393100 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ws6\" (UniqueName: \"kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6\") pod \"ceilometer-0\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " pod="openstack/ceilometer-0" Feb 17 14:33:18 crc kubenswrapper[4762]: I0217 14:33:18.453038 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:19 crc kubenswrapper[4762]: I0217 14:33:19.061488 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.039957 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerStarted","Data":"8335223652f38d6ac0fe517cc512661f0ed97b1507173ea0d7ec73c25a7848cb"} Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.047050 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerStarted","Data":"25a310a6ea3a249f1ba5708296333d9999bd5e8c7e0b857a6864ec336c79f102"} Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.047308 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-api" containerID="cri-o://00d51b9c5984bae0b8d24c37dc4dfe0832ed42d5c56b0c95a5ee8bd82342d8e1" gracePeriod=30 Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.047633 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-notifier" containerID="cri-o://dc6c4a8ff8fae25315b467372ae51fb8f33bb19f086ced32b0839a20fe2f12e2" gracePeriod=30 Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.047785 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-evaluator" containerID="cri-o://bfaa8ab977c5183d16ee6888a20a0627d6e28f847b814137b81448edd19e2403" gracePeriod=30 Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.047877 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-listener" containerID="cri-o://25a310a6ea3a249f1ba5708296333d9999bd5e8c7e0b857a6864ec336c79f102" gracePeriod=30 Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.096045 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.248435426 podStartE2EDuration="9.096024052s" podCreationTimestamp="2026-02-17 14:33:11 +0000 UTC" firstStartedPulling="2026-02-17 14:33:12.861387103 +0000 UTC m=+1673.441387775" lastFinishedPulling="2026-02-17 14:33:19.708975749 +0000 UTC m=+1680.288976401" observedRunningTime="2026-02-17 14:33:20.090275647 +0000 UTC m=+1680.670276299" watchObservedRunningTime="2026-02-17 14:33:20.096024052 +0000 UTC m=+1680.676024704" Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.211632 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.262273 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.262324 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:33:20 crc kubenswrapper[4762]: I0217 14:33:20.269002 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.059714 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerStarted","Data":"050870f762b8e02712ad722fc90022cfdff99cd3054b7165e45a68db06297785"} Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.063260 4762 generic.go:334] "Generic (PLEG): container finished" podID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerID="bfaa8ab977c5183d16ee6888a20a0627d6e28f847b814137b81448edd19e2403" exitCode=0 Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.063291 4762 generic.go:334] "Generic (PLEG): container finished" podID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerID="00d51b9c5984bae0b8d24c37dc4dfe0832ed42d5c56b0c95a5ee8bd82342d8e1" exitCode=0 Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.063304 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerDied","Data":"bfaa8ab977c5183d16ee6888a20a0627d6e28f847b814137b81448edd19e2403"} Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.063334 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerDied","Data":"00d51b9c5984bae0b8d24c37dc4dfe0832ed42d5c56b0c95a5ee8bd82342d8e1"} Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.094024 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.194429 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.260331 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.344838 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.255:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.345109 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.255:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:33:21 crc kubenswrapper[4762]: I0217 14:33:21.447362 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:33:22 crc kubenswrapper[4762]: I0217 14:33:22.103298 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerStarted","Data":"dba79217483916a4ef79968592b5deebbda18e2c78a1f0d5009a7cb247a213ac"} Feb 17 14:33:22 crc kubenswrapper[4762]: I0217 14:33:22.103342 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerStarted","Data":"3fc17f561123ba0f7d6bfaf8be800de6b8947e1d6d2dd298963a7e6a8715d28c"} Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.119953 4762 generic.go:334] "Generic (PLEG): container finished" podID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" 
containerID="265976f262e9c2b001b72753aa8e69799c1f6e7118b1c455d40777e503ecc600" exitCode=137 Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.120373 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pxsl2" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" containerID="cri-o://97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615" gracePeriod=2 Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.120118 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d6333e0c-df36-41f4-9efa-f3b1c161fa9a","Type":"ContainerDied","Data":"265976f262e9c2b001b72753aa8e69799c1f6e7118b1c455d40777e503ecc600"} Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.515933 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.654767 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.662828 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs8wz\" (UniqueName: \"kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz\") pod \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.663029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data\") pod \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.663138 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle\") pod \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\" (UID: \"d6333e0c-df36-41f4-9efa-f3b1c161fa9a\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.674130 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz" (OuterVolumeSpecName: "kube-api-access-cs8wz") pod "d6333e0c-df36-41f4-9efa-f3b1c161fa9a" (UID: "d6333e0c-df36-41f4-9efa-f3b1c161fa9a"). InnerVolumeSpecName "kube-api-access-cs8wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.710764 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6333e0c-df36-41f4-9efa-f3b1c161fa9a" (UID: "d6333e0c-df36-41f4-9efa-f3b1c161fa9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.723553 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data" (OuterVolumeSpecName: "config-data") pod "d6333e0c-df36-41f4-9efa-f3b1c161fa9a" (UID: "d6333e0c-df36-41f4-9efa-f3b1c161fa9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.766992 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities\") pod \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.767076 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxsl\" (UniqueName: \"kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl\") pod \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.767137 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content\") pod \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\" (UID: \"f5d305d0-ab00-4c29-b7d4-687dd2e46193\") " Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.767696 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.767710 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.767723 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs8wz\" (UniqueName: \"kubernetes.io/projected/d6333e0c-df36-41f4-9efa-f3b1c161fa9a-kube-api-access-cs8wz\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.770403 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities" (OuterVolumeSpecName: "utilities") pod "f5d305d0-ab00-4c29-b7d4-687dd2e46193" (UID: "f5d305d0-ab00-4c29-b7d4-687dd2e46193"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.776795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl" (OuterVolumeSpecName: "kube-api-access-bxxsl") pod "f5d305d0-ab00-4c29-b7d4-687dd2e46193" (UID: "f5d305d0-ab00-4c29-b7d4-687dd2e46193"). InnerVolumeSpecName "kube-api-access-bxxsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.871060 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.871095 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxsl\" (UniqueName: \"kubernetes.io/projected/f5d305d0-ab00-4c29-b7d4-687dd2e46193-kube-api-access-bxxsl\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.907866 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5d305d0-ab00-4c29-b7d4-687dd2e46193" (UID: "f5d305d0-ab00-4c29-b7d4-687dd2e46193"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:23 crc kubenswrapper[4762]: I0217 14:33:23.985451 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5d305d0-ab00-4c29-b7d4-687dd2e46193-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.133432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerStarted","Data":"c641259fcfe18ad927f55ddd072c2c5c6e92fd54f2727319d179dab669921205"} Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.133762 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.153006 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d6333e0c-df36-41f4-9efa-f3b1c161fa9a","Type":"ContainerDied","Data":"2c1230549bb0a9c609872f87d553791aef4556bd623645cbd474401369ea51f5"} Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.153087 4762 scope.go:117] "RemoveContainer" containerID="265976f262e9c2b001b72753aa8e69799c1f6e7118b1c455d40777e503ecc600" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.154087 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.167684 4762 generic.go:334] "Generic (PLEG): container finished" podID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerID="97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615" exitCode=0 Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.167776 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerDied","Data":"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615"} Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.167798 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxsl2" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.167818 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxsl2" event={"ID":"f5d305d0-ab00-4c29-b7d4-687dd2e46193","Type":"ContainerDied","Data":"29a82c160b6f08ce019366202cc92092b79121a7e11c71afa6a4eecda5aa4133"} Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.180302 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.880270864 podStartE2EDuration="6.180267648s" podCreationTimestamp="2026-02-17 14:33:18 +0000 UTC" firstStartedPulling="2026-02-17 14:33:19.66067142 +0000 UTC m=+1680.240672072" lastFinishedPulling="2026-02-17 14:33:22.960668204 +0000 UTC m=+1683.540668856" observedRunningTime="2026-02-17 14:33:24.170815271 +0000 UTC m=+1684.750815943" watchObservedRunningTime="2026-02-17 14:33:24.180267648 +0000 UTC m=+1684.760268310" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.213116 4762 scope.go:117] "RemoveContainer" containerID="97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.239795 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.260801 4762 scope.go:117] "RemoveContainer" containerID="b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.273690 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.287459 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.317291 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pxsl2"] Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.344767 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.345281 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345298 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.345319 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="extract-utilities" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345326 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="extract-utilities" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.345342 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345349 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.345361 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="extract-content" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345366 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="extract-content" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345589 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" containerName="registry-server" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.345619 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.346425 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.346850 4762 scope.go:117] "RemoveContainer" containerID="6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.351228 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.351228 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.351593 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.366280 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.397000 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.397190 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.397377 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q86jb\" (UniqueName: \"kubernetes.io/projected/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-kube-api-access-q86jb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.397446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.397564 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.422025 4762 scope.go:117] "RemoveContainer" containerID="97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.424160 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615\": container with ID starting with 97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615 not found: ID does not exist" containerID="97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.424217 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615"} err="failed to get container status \"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615\": rpc error: code = NotFound desc = could not find container \"97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615\": container with ID starting with 97f03dd47ff141f0824e9c778ef3ac5fc2c6fa0f9d84e902ffdccc8f03f03615 not found: ID does not exist" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.424251 4762 scope.go:117] "RemoveContainer" containerID="b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.426256 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37\": container with ID starting with b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37 not found: ID does not exist" containerID="b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.426301 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37"} err="failed to get container status \"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37\": rpc error: code = NotFound desc = could not find container \"b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37\": container with ID starting with b681f151a3535f2a04e3eb5a09ff5461a4407970081b998ffe6c0d645fa9dc37 not found: ID does not exist" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.426330 4762 scope.go:117] "RemoveContainer" containerID="6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb" Feb 17 14:33:24 crc kubenswrapper[4762]: E0217 14:33:24.430859 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb\": container with ID starting with 6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb not found: ID does not exist" containerID="6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.430896 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb"} err="failed to get container status \"6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb\": rpc error: code = NotFound desc = could not find container \"6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb\": container with ID starting with 6a66ac9c5c9e7ff640c870cd09e32814c28e87c5e80d5ad473369acbb3e4b4cb not found: ID does not exist" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.499353 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q86jb\" (UniqueName: \"kubernetes.io/projected/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-kube-api-access-q86jb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.499435 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.499517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.499555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.499675 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.513348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.517122 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.518525 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:24 crc 
Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.524273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.528453 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q86jb\" (UniqueName: \"kubernetes.io/projected/a388c0a6-5d6a-4d70-8527-40ae2f62eca4-kube-api-access-q86jb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a388c0a6-5d6a-4d70-8527-40ae2f62eca4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:33:24 crc kubenswrapper[4762]: I0217 14:33:24.682781 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:33:25 crc kubenswrapper[4762]: I0217 14:33:25.171981 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:33:26 crc kubenswrapper[4762]: I0217 14:33:26.084408 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6333e0c-df36-41f4-9efa-f3b1c161fa9a" path="/var/lib/kubelet/pods/d6333e0c-df36-41f4-9efa-f3b1c161fa9a/volumes"
Feb 17 14:33:26 crc kubenswrapper[4762]: I0217 14:33:26.085307 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d305d0-ab00-4c29-b7d4-687dd2e46193" path="/var/lib/kubelet/pods/f5d305d0-ab00-4c29-b7d4-687dd2e46193/volumes"
Feb 17 14:33:26 crc kubenswrapper[4762]: I0217 14:33:26.203095 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a388c0a6-5d6a-4d70-8527-40ae2f62eca4","Type":"ContainerStarted","Data":"b0797dcfad9bd64b8881ef4c7197731bcc0207279cc09a3d537a571b96810f3b"}
Feb 17 14:33:26 crc kubenswrapper[4762]: I0217 14:33:26.203396 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a388c0a6-5d6a-4d70-8527-40ae2f62eca4","Type":"ContainerStarted","Data":"f37f9932f8f9fa54f2884437a24046c67282f3db587e2e34eaa5e9dca3344f3a"}
Feb 17 14:33:26 crc kubenswrapper[4762]: I0217 14:33:26.232426 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.232403093 podStartE2EDuration="2.232403093s" podCreationTimestamp="2026-02-17 14:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:26.225167467 +0000 UTC m=+1686.805168119" watchObservedRunningTime="2026-02-17 14:33:26.232403093 +0000 UTC m=+1686.812403745"
Feb 17 14:33:27 crc kubenswrapper[4762]: I0217 14:33:27.135971 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 14:33:27 crc kubenswrapper[4762]: I0217 14:33:27.136086 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 14:33:27 crc kubenswrapper[4762]: I0217 14:33:27.143120 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 14:33:27 crc kubenswrapper[4762]: I0217 14:33:27.143256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 14:33:29 crc kubenswrapper[4762]: I0217 14:33:29.683388 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.071934 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:33:30 crc kubenswrapper[4762]: E0217 14:33:30.072429 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.265555 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.265877 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.266988 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.267035 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.270823 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.270900 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.564577 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-z9jpf"] Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.573012 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.610743 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-z9jpf"] Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.689542 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5ljd\" (UniqueName: \"kubernetes.io/projected/7ee8353e-dc34-46ac-ace9-d0de5574c65b-kube-api-access-w5ljd\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.689636 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.689812 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.689908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.689949 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.690126 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-config\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.791832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-config\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.791980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5ljd\" (UniqueName: \"kubernetes.io/projected/7ee8353e-dc34-46ac-ace9-d0de5574c65b-kube-api-access-w5ljd\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792110 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792165 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792953 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792958 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-config\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.792966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.793077 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.793631 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ee8353e-dc34-46ac-ace9-d0de5574c65b-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.815085 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5ljd\" (UniqueName: \"kubernetes.io/projected/7ee8353e-dc34-46ac-ace9-d0de5574c65b-kube-api-access-w5ljd\") pod 
\"dnsmasq-dns-f84f9ccf-z9jpf\" (UID: \"7ee8353e-dc34-46ac-ace9-d0de5574c65b\") " pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:30 crc kubenswrapper[4762]: I0217 14:33:30.909963 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:31 crc kubenswrapper[4762]: I0217 14:33:31.516872 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-z9jpf"] Feb 17 14:33:32 crc kubenswrapper[4762]: I0217 14:33:32.303594 4762 generic.go:334] "Generic (PLEG): container finished" podID="7ee8353e-dc34-46ac-ace9-d0de5574c65b" containerID="6972b3190a5ed02328121cc4c9be232185616cbce8ead4f67e13ed8e6e026969" exitCode=0 Feb 17 14:33:32 crc kubenswrapper[4762]: I0217 14:33:32.303724 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" event={"ID":"7ee8353e-dc34-46ac-ace9-d0de5574c65b","Type":"ContainerDied","Data":"6972b3190a5ed02328121cc4c9be232185616cbce8ead4f67e13ed8e6e026969"} Feb 17 14:33:32 crc kubenswrapper[4762]: I0217 14:33:32.304029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" event={"ID":"7ee8353e-dc34-46ac-ace9-d0de5574c65b","Type":"ContainerStarted","Data":"88614e384bd82f67a4aa3d376d1a153e82822db2ed00c224d71001227ad2e125"} Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.068312 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.318552 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" event={"ID":"7ee8353e-dc34-46ac-ace9-d0de5574c65b","Type":"ContainerStarted","Data":"56f3e492170869319625810a93be4d6df1a2f236cfd768640e4c151909f243e7"} Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.318701 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-log" containerID="cri-o://74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097" gracePeriod=30 Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.318782 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-api" containerID="cri-o://4f14bc89881c6bbb12f27ae85df3b3fea2f73bd13540648b9480b82879ea3abd" gracePeriod=30 Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.355880 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" podStartSLOduration=3.355856298 podStartE2EDuration="3.355856298s" podCreationTimestamp="2026-02-17 14:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:33.348704684 +0000 UTC m=+1693.928705356" watchObservedRunningTime="2026-02-17 14:33:33.355856298 +0000 UTC m=+1693.935856950" Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.518565 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.518850 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-central-agent" containerID="cri-o://050870f762b8e02712ad722fc90022cfdff99cd3054b7165e45a68db06297785" 
Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.518967 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-notification-agent" containerID="cri-o://3fc17f561123ba0f7d6bfaf8be800de6b8947e1d6d2dd298963a7e6a8715d28c" gracePeriod=30
Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.518994 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="proxy-httpd" containerID="cri-o://c641259fcfe18ad927f55ddd072c2c5c6e92fd54f2727319d179dab669921205" gracePeriod=30
Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.518942 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="sg-core" containerID="cri-o://dba79217483916a4ef79968592b5deebbda18e2c78a1f0d5009a7cb247a213ac" gracePeriod=30
Feb 17 14:33:33 crc kubenswrapper[4762]: I0217 14:33:33.533781 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.1:3000/\": EOF"
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332506 4762 generic.go:334] "Generic (PLEG): container finished" podID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerID="c641259fcfe18ad927f55ddd072c2c5c6e92fd54f2727319d179dab669921205" exitCode=0
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332816 4762 generic.go:334] "Generic (PLEG): container finished" podID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerID="dba79217483916a4ef79968592b5deebbda18e2c78a1f0d5009a7cb247a213ac" exitCode=2
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332830 4762 generic.go:334] "Generic (PLEG): container finished" podID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerID="050870f762b8e02712ad722fc90022cfdff99cd3054b7165e45a68db06297785" exitCode=0
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332579 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerDied","Data":"c641259fcfe18ad927f55ddd072c2c5c6e92fd54f2727319d179dab669921205"}
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332901 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerDied","Data":"dba79217483916a4ef79968592b5deebbda18e2c78a1f0d5009a7cb247a213ac"}
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.332919 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerDied","Data":"050870f762b8e02712ad722fc90022cfdff99cd3054b7165e45a68db06297785"}
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.335404 4762 generic.go:334] "Generic (PLEG): container finished" podID="12165630-4428-4b61-a595-eec93ce5938d" containerID="74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097" exitCode=143
Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.335480 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerDied","Data":"74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097"}
event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerDied","Data":"74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097"} Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.335685 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.683828 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:34 crc kubenswrapper[4762]: I0217 14:33:34.703961 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.353422 4762 generic.go:334] "Generic (PLEG): container finished" podID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerID="3fc17f561123ba0f7d6bfaf8be800de6b8947e1d6d2dd298963a7e6a8715d28c" exitCode=0 Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.353697 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerDied","Data":"3fc17f561123ba0f7d6bfaf8be800de6b8947e1d6d2dd298963a7e6a8715d28c"} Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.370879 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.605701 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hmbsl"] Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.607319 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.610014 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.610312 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.648401 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hmbsl"] Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.680415 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.680504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.680537 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.684093 4762 
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.787773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.788174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.788206 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.788681 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65r4p\" (UniqueName: \"kubernetes.io/projected/c15862fc-7a11-484e-8343-c565ddcc60eb-kube-api-access-65r4p\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.797256 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.797439 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.805469 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.815526 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65r4p\" (UniqueName: \"kubernetes.io/projected/c15862fc-7a11-484e-8343-c565ddcc60eb-kube-api-access-65r4p\") pod \"nova-cell1-cell-mapping-hmbsl\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " pod="openstack/nova-cell1-cell-mapping-hmbsl"
Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.934401 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hmbsl"
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.961537 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995266 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4ws6\" (UniqueName: \"kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995366 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995448 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995472 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995520 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.995571 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle\") pod \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\" (UID: \"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75\") " Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.996226 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:35 crc kubenswrapper[4762]: I0217 14:33:35.996626 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.002027 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts" (OuterVolumeSpecName: "scripts") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.002824 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6" (OuterVolumeSpecName: "kube-api-access-p4ws6") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "kube-api-access-p4ws6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.060597 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.109489 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.109545 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.109560 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.109576 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4ws6\" (UniqueName: \"kubernetes.io/projected/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-kube-api-access-p4ws6\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.109594 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.206283 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.214991 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.252501 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data" (OuterVolumeSpecName: "config-data") pod "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" (UID: "d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.318785 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.376040 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.376677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75","Type":"ContainerDied","Data":"8335223652f38d6ac0fe517cc512661f0ed97b1507173ea0d7ec73c25a7848cb"} Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.376767 4762 scope.go:117] "RemoveContainer" containerID="c641259fcfe18ad927f55ddd072c2c5c6e92fd54f2727319d179dab669921205" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.456616 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.468996 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.490018 4762 scope.go:117] "RemoveContainer" containerID="dba79217483916a4ef79968592b5deebbda18e2c78a1f0d5009a7cb247a213ac" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.499312 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:36 crc kubenswrapper[4762]: E0217 14:33:36.499922 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-notification-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.499936 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-notification-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: E0217 14:33:36.499966 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="sg-core" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.499972 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="sg-core" Feb 17 14:33:36 crc kubenswrapper[4762]: E0217 14:33:36.500001 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="proxy-httpd" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.500007 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="proxy-httpd" Feb 17 14:33:36 crc kubenswrapper[4762]: E0217 14:33:36.500028 4762 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-central-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.500034 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-central-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.500284 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="sg-core" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.500299 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="proxy-httpd" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.500321 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-notification-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.501053 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" containerName="ceilometer-central-agent" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.504583 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.509148 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.509424 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.527616 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.528557 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzddc\" (UniqueName: \"kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.528708 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.528768 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.528815 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc 
kubenswrapper[4762]: I0217 14:33:36.529039 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.529076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.535502 4762 scope.go:117] "RemoveContainer" containerID="3fc17f561123ba0f7d6bfaf8be800de6b8947e1d6d2dd298963a7e6a8715d28c" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.537001 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.565226 4762 scope.go:117] "RemoveContainer" containerID="050870f762b8e02712ad722fc90022cfdff99cd3054b7165e45a68db06297785" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.603957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hmbsl"] Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.648976 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.651821 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.656117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.656558 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.656789 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.657116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 
14:33:36.657188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.657370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzddc\" (UniqueName: \"kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.657923 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.659879 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.661797 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.664912 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.705034 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.711223 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzddc\" (UniqueName: \"kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc\") pod \"ceilometer-0\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.841038 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:36 crc kubenswrapper[4762]: I0217 14:33:36.903397 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.412398 4762 generic.go:334] "Generic (PLEG): container finished" podID="12165630-4428-4b61-a595-eec93ce5938d" containerID="4f14bc89881c6bbb12f27ae85df3b3fea2f73bd13540648b9480b82879ea3abd" exitCode=0 Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.412559 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerDied","Data":"4f14bc89881c6bbb12f27ae85df3b3fea2f73bd13540648b9480b82879ea3abd"} Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.415246 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hmbsl" event={"ID":"c15862fc-7a11-484e-8343-c565ddcc60eb","Type":"ContainerStarted","Data":"10ad82c58238c4240e79389188f39d3f2d9317fe5cd32047ec2ba297ccc9e5d5"} Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.531900 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.613936 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgm5b\" (UniqueName: \"kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b\") pod \"12165630-4428-4b61-a595-eec93ce5938d\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.614177 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data\") pod \"12165630-4428-4b61-a595-eec93ce5938d\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.614834 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle\") pod \"12165630-4428-4b61-a595-eec93ce5938d\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.614938 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs\") pod \"12165630-4428-4b61-a595-eec93ce5938d\" (UID: \"12165630-4428-4b61-a595-eec93ce5938d\") " Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.616233 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs" (OuterVolumeSpecName: "logs") pod "12165630-4428-4b61-a595-eec93ce5938d" (UID: "12165630-4428-4b61-a595-eec93ce5938d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.620861 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b" (OuterVolumeSpecName: "kube-api-access-pgm5b") pod "12165630-4428-4b61-a595-eec93ce5938d" (UID: "12165630-4428-4b61-a595-eec93ce5938d"). InnerVolumeSpecName "kube-api-access-pgm5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.634699 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.671162 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12165630-4428-4b61-a595-eec93ce5938d" (UID: "12165630-4428-4b61-a595-eec93ce5938d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.677395 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data" (OuterVolumeSpecName: "config-data") pod "12165630-4428-4b61-a595-eec93ce5938d" (UID: "12165630-4428-4b61-a595-eec93ce5938d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.717924 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.717959 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12165630-4428-4b61-a595-eec93ce5938d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.717969 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgm5b\" (UniqueName: \"kubernetes.io/projected/12165630-4428-4b61-a595-eec93ce5938d-kube-api-access-pgm5b\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:37 crc kubenswrapper[4762]: I0217 14:33:37.717981 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12165630-4428-4b61-a595-eec93ce5938d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.086598 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75" path="/var/lib/kubelet/pods/d2d8bffc-38fc-4c9f-bd93-629c8f6aaf75/volumes" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.499074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerStarted","Data":"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9"} Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.500749 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerStarted","Data":"53a4b3bd63f44293085152d9f7d4b85cea85461ca8a361004a85eadfc54b7fd9"} Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.501995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hmbsl" event={"ID":"c15862fc-7a11-484e-8343-c565ddcc60eb","Type":"ContainerStarted","Data":"5eec962dd211446ef8a8f7d17ba4922b5ce36ef85cec693ce7a62710fce9a4f5"} Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.504904 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"12165630-4428-4b61-a595-eec93ce5938d","Type":"ContainerDied","Data":"f4cbba9bde8c54b8ec212f2e279f6679cc2af812e3ed91ddbcfd203cac5396ff"} Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.504972 4762 scope.go:117] "RemoveContainer" containerID="4f14bc89881c6bbb12f27ae85df3b3fea2f73bd13540648b9480b82879ea3abd" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.505171 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.529890 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hmbsl" podStartSLOduration=3.529866705 podStartE2EDuration="3.529866705s" podCreationTimestamp="2026-02-17 14:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:38.520698326 +0000 UTC m=+1699.100698988" watchObservedRunningTime="2026-02-17 14:33:38.529866705 +0000 UTC m=+1699.109867357" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.556742 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.556847 4762 scope.go:117] "RemoveContainer" containerID="74fb564dbde7810e1263c381fdd6bf91af1b9ea2163631f38629d22afc8d3097" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.575035 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.607983 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:38 crc kubenswrapper[4762]: E0217 14:33:38.608681 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-api" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.608707 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-api" Feb 17 14:33:38 crc kubenswrapper[4762]: E0217 14:33:38.608732 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-log" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.608742 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-log" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.609074 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-log" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.609133 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="12165630-4428-4b61-a595-eec93ce5938d" containerName="nova-api-api" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.611045 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.614414 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.615871 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.615945 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.623770 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.793347 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jwf\" (UniqueName: \"kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.793598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.794147 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.794602 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.794753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.794888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.897729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.897823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs\") pod \"nova-api-0\" (UID: 
\"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.898050 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.898094 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.898127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.898223 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jwf\" (UniqueName: \"kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.898363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.906617 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.908374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.909079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.913161 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.918106 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jwf\" (UniqueName: \"kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf\") pod \"nova-api-0\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " 
pod="openstack/nova-api-0" Feb 17 14:33:38 crc kubenswrapper[4762]: I0217 14:33:38.944857 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:39 crc kubenswrapper[4762]: I0217 14:33:39.500993 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:39 crc kubenswrapper[4762]: I0217 14:33:39.564823 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerStarted","Data":"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a"} Feb 17 14:33:39 crc kubenswrapper[4762]: I0217 14:33:39.568796 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerStarted","Data":"e469f769a84e51e0d3176297ee696952ae60b7f50f0b77299dd41e0f420f32d9"} Feb 17 14:33:40 crc kubenswrapper[4762]: I0217 14:33:40.098615 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12165630-4428-4b61-a595-eec93ce5938d" path="/var/lib/kubelet/pods/12165630-4428-4b61-a595-eec93ce5938d/volumes" Feb 17 14:33:40 crc kubenswrapper[4762]: I0217 14:33:40.582447 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerStarted","Data":"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699"} Feb 17 14:33:40 crc kubenswrapper[4762]: I0217 14:33:40.586128 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerStarted","Data":"887e15ad19fc27a12c37952a2b9950f8a8812e9e7a0510cec185fc9d3fd62b66"} Feb 17 14:33:40 crc kubenswrapper[4762]: I0217 14:33:40.911839 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-z9jpf" Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.008569 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.011206 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="dnsmasq-dns" containerID="cri-o://3d963d3c523250d1170368819f3f00deb0ad2568068ffec474e10de1da127b5b" gracePeriod=10 Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.162635 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.248:5353: connect: connection refused" Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.611913 4762 generic.go:334] "Generic (PLEG): container finished" podID="017f582c-a428-4df1-85e2-955bd88c9b26" containerID="3d963d3c523250d1170368819f3f00deb0ad2568068ffec474e10de1da127b5b" exitCode=0 Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.612325 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" event={"ID":"017f582c-a428-4df1-85e2-955bd88c9b26","Type":"ContainerDied","Data":"3d963d3c523250d1170368819f3f00deb0ad2568068ffec474e10de1da127b5b"} Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.615526 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerStarted","Data":"47c45593fb8aba9e37a2a183212858aca006aa1eb329e1e177dd0ccb9fe0095a"} Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.655852 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.655832359 podStartE2EDuration="3.655832359s" podCreationTimestamp="2026-02-17 14:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:41.636604027 +0000 UTC m=+1702.216604669" watchObservedRunningTime="2026-02-17 14:33:41.655832359 +0000 UTC m=+1702.235833011" Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.802046 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.977437 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.977899 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.978205 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.979839 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjbl\" (UniqueName: \"kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.979932 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.980129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb\") pod \"017f582c-a428-4df1-85e2-955bd88c9b26\" (UID: \"017f582c-a428-4df1-85e2-955bd88c9b26\") " Feb 17 14:33:41 crc kubenswrapper[4762]: I0217 14:33:41.992147 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl" (OuterVolumeSpecName: "kube-api-access-dqjbl") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "kube-api-access-dqjbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.068072 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.091685 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.091729 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqjbl\" (UniqueName: \"kubernetes.io/projected/017f582c-a428-4df1-85e2-955bd88c9b26-kube-api-access-dqjbl\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.128038 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.152096 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.164468 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.170998 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config" (OuterVolumeSpecName: "config") pod "017f582c-a428-4df1-85e2-955bd88c9b26" (UID: "017f582c-a428-4df1-85e2-955bd88c9b26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.194700 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.194733 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.194746 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.194758 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/017f582c-a428-4df1-85e2-955bd88c9b26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.628327 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" event={"ID":"017f582c-a428-4df1-85e2-955bd88c9b26","Type":"ContainerDied","Data":"d6b57840b8086c9e15ec5808e20de28c7ad8a04eff43787bc78252ea4af3a28d"} Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.628401 4762 scope.go:117] "RemoveContainer" containerID="3d963d3c523250d1170368819f3f00deb0ad2568068ffec474e10de1da127b5b" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.628573 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-ktxq9" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633575 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-central-agent" containerID="cri-o://4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9" gracePeriod=30 Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633619 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerStarted","Data":"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876"} Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633801 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="proxy-httpd" containerID="cri-o://cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876" gracePeriod=30 Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633819 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-notification-agent" containerID="cri-o://fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a" gracePeriod=30 Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.633780 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" 
containerName="sg-core" containerID="cri-o://38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699" gracePeriod=30 Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.659383 4762 scope.go:117] "RemoveContainer" containerID="ba1d0114d094f9fc0b08a3e520d6413062ad123cbd491490c0d46ab67c5e0859" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.692631 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.968542313 podStartE2EDuration="6.692606022s" podCreationTimestamp="2026-02-17 14:33:36 +0000 UTC" firstStartedPulling="2026-02-17 14:33:37.596715416 +0000 UTC m=+1698.176716058" lastFinishedPulling="2026-02-17 14:33:41.320779115 +0000 UTC m=+1701.900779767" observedRunningTime="2026-02-17 14:33:42.671319804 +0000 UTC m=+1703.251320456" watchObservedRunningTime="2026-02-17 14:33:42.692606022 +0000 UTC m=+1703.272606674" Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.732696 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:33:42 crc kubenswrapper[4762]: I0217 14:33:42.748340 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-ktxq9"] Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649618 4762 generic.go:334] "Generic (PLEG): container finished" podID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerID="cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876" exitCode=0 Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649907 4762 generic.go:334] "Generic (PLEG): container finished" podID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerID="38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699" exitCode=2 Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649918 4762 generic.go:334] "Generic (PLEG): container finished" podID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerID="fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a" exitCode=0 Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerDied","Data":"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876"} Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649955 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerDied","Data":"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699"} Feb 17 14:33:43 crc kubenswrapper[4762]: I0217 14:33:43.649971 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerDied","Data":"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a"} Feb 17 14:33:44 crc kubenswrapper[4762]: I0217 14:33:44.071629 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:33:44 crc kubenswrapper[4762]: E0217 14:33:44.072005 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" 
podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:33:44 crc kubenswrapper[4762]: I0217 14:33:44.084499 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" path="/var/lib/kubelet/pods/017f582c-a428-4df1-85e2-955bd88c9b26/volumes" Feb 17 14:33:44 crc kubenswrapper[4762]: I0217 14:33:44.663238 4762 generic.go:334] "Generic (PLEG): container finished" podID="c15862fc-7a11-484e-8343-c565ddcc60eb" containerID="5eec962dd211446ef8a8f7d17ba4922b5ce36ef85cec693ce7a62710fce9a4f5" exitCode=0 Feb 17 14:33:44 crc kubenswrapper[4762]: I0217 14:33:44.663283 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hmbsl" event={"ID":"c15862fc-7a11-484e-8343-c565ddcc60eb","Type":"ContainerDied","Data":"5eec962dd211446ef8a8f7d17ba4922b5ce36ef85cec693ce7a62710fce9a4f5"} Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.297430 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.419897 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts\") pod \"c15862fc-7a11-484e-8343-c565ddcc60eb\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.420156 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle\") pod \"c15862fc-7a11-484e-8343-c565ddcc60eb\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.420220 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65r4p\" (UniqueName: \"kubernetes.io/projected/c15862fc-7a11-484e-8343-c565ddcc60eb-kube-api-access-65r4p\") pod \"c15862fc-7a11-484e-8343-c565ddcc60eb\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.420439 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data\") pod \"c15862fc-7a11-484e-8343-c565ddcc60eb\" (UID: \"c15862fc-7a11-484e-8343-c565ddcc60eb\") " Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.426784 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts" (OuterVolumeSpecName: "scripts") pod "c15862fc-7a11-484e-8343-c565ddcc60eb" (UID: "c15862fc-7a11-484e-8343-c565ddcc60eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.427049 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15862fc-7a11-484e-8343-c565ddcc60eb-kube-api-access-65r4p" (OuterVolumeSpecName: "kube-api-access-65r4p") pod "c15862fc-7a11-484e-8343-c565ddcc60eb" (UID: "c15862fc-7a11-484e-8343-c565ddcc60eb"). InnerVolumeSpecName "kube-api-access-65r4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.465946 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data" (OuterVolumeSpecName: "config-data") pod "c15862fc-7a11-484e-8343-c565ddcc60eb" (UID: "c15862fc-7a11-484e-8343-c565ddcc60eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.479464 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c15862fc-7a11-484e-8343-c565ddcc60eb" (UID: "c15862fc-7a11-484e-8343-c565ddcc60eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.523567 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.523608 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.523617 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15862fc-7a11-484e-8343-c565ddcc60eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.523627 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65r4p\" (UniqueName: \"kubernetes.io/projected/c15862fc-7a11-484e-8343-c565ddcc60eb-kube-api-access-65r4p\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.828702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hmbsl" event={"ID":"c15862fc-7a11-484e-8343-c565ddcc60eb","Type":"ContainerDied","Data":"10ad82c58238c4240e79389188f39d3f2d9317fe5cd32047ec2ba297ccc9e5d5"} Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.829007 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ad82c58238c4240e79389188f39d3f2d9317fe5cd32047ec2ba297ccc9e5d5" Feb 17 14:33:46 crc kubenswrapper[4762]: I0217 14:33:46.828798 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hmbsl" Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:46.997603 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:46.997937 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-log" containerID="cri-o://887e15ad19fc27a12c37952a2b9950f8a8812e9e7a0510cec185fc9d3fd62b66" gracePeriod=30 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:46.998548 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-api" containerID="cri-o://47c45593fb8aba9e37a2a183212858aca006aa1eb329e1e177dd0ccb9fe0095a" gracePeriod=30 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.012453 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.012708 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerName="nova-scheduler-scheduler" containerID="cri-o://00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" gracePeriod=30 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.063787 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.064039 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" containerID="cri-o://e516c2d595f01a19af1b3b7531bf2bd3e4520e05d113cb97d33cbdbed416b182" gracePeriod=30 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.064326 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" containerID="cri-o://59773f5a9db93ad22b346d36f4b50875a85c9b2c4b699bcec80eb85aa725692e" gracePeriod=30 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.854197 4762 generic.go:334] "Generic (PLEG): container finished" podID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerID="e516c2d595f01a19af1b3b7531bf2bd3e4520e05d113cb97d33cbdbed416b182" exitCode=143 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.854578 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerDied","Data":"e516c2d595f01a19af1b3b7531bf2bd3e4520e05d113cb97d33cbdbed416b182"} Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858555 4762 generic.go:334] "Generic (PLEG): container finished" podID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerID="47c45593fb8aba9e37a2a183212858aca006aa1eb329e1e177dd0ccb9fe0095a" exitCode=0 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858621 4762 generic.go:334] "Generic (PLEG): container finished" podID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerID="887e15ad19fc27a12c37952a2b9950f8a8812e9e7a0510cec185fc9d3fd62b66" exitCode=143 Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858629 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerDied","Data":"47c45593fb8aba9e37a2a183212858aca006aa1eb329e1e177dd0ccb9fe0095a"} Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858712 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerDied","Data":"887e15ad19fc27a12c37952a2b9950f8a8812e9e7a0510cec185fc9d3fd62b66"} Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858723 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f6b24e9c-a819-4791-b2d5-97e4e56a22c1","Type":"ContainerDied","Data":"e469f769a84e51e0d3176297ee696952ae60b7f50f0b77299dd41e0f420f32d9"} Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.858733 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e469f769a84e51e0d3176297ee696952ae60b7f50f0b77299dd41e0f420f32d9" Feb 17 14:33:47 crc kubenswrapper[4762]: I0217 14:33:47.929019 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.029912 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.030163 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.030213 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.030275 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jwf\" (UniqueName: \"kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.030336 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.030447 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.031123 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs" (OuterVolumeSpecName: "logs") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: 
"f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.031267 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.059862 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf" (OuterVolumeSpecName: "kube-api-access-j7jwf") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "kube-api-access-j7jwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.142496 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jwf\" (UniqueName: \"kubernetes.io/projected/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-kube-api-access-j7jwf\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.366068 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data" (OuterVolumeSpecName: "config-data") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.376012 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.421867 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.431038 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") pod \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\" (UID: \"f6b24e9c-a819-4791-b2d5-97e4e56a22c1\") " Feb 17 14:33:48 crc kubenswrapper[4762]: W0217 14:33:48.431177 4762 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f6b24e9c-a819-4791-b2d5-97e4e56a22c1/volumes/kubernetes.io~secret/public-tls-certs Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.431202 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.432324 4762 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.432341 4762 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.432352 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.446956 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b24e9c-a819-4791-b2d5-97e4e56a22c1" (UID: "f6b24e9c-a819-4791-b2d5-97e4e56a22c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.534128 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b24e9c-a819-4791-b2d5-97e4e56a22c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.874590 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.929158 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:48 crc kubenswrapper[4762]: I0217 14:33:48.962779 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:48.999534 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:49 crc kubenswrapper[4762]: E0217 14:33:49.000181 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-log" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000194 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-log" Feb 17 14:33:49 crc kubenswrapper[4762]: E0217 14:33:49.000230 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-api" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000238 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-api" Feb 17 14:33:49 crc kubenswrapper[4762]: E0217 14:33:49.000263 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="init" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000271 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="init" Feb 17 14:33:49 crc kubenswrapper[4762]: E0217 14:33:49.000287 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="dnsmasq-dns" Feb 17 
14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000293 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="dnsmasq-dns" Feb 17 14:33:49 crc kubenswrapper[4762]: E0217 14:33:49.000304 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15862fc-7a11-484e-8343-c565ddcc60eb" containerName="nova-manage" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000310 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15862fc-7a11-484e-8343-c565ddcc60eb" containerName="nova-manage" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000537 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-api" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000550 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="017f582c-a428-4df1-85e2-955bd88c9b26" containerName="dnsmasq-dns" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000566 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" containerName="nova-api-log" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.000576 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15862fc-7a11-484e-8343-c565ddcc60eb" containerName="nova-manage" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.002107 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.004140 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.004508 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.005721 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.027877 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-logs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057132 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-config-data\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7vv\" (UniqueName: \"kubernetes.io/projected/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-kube-api-access-rn7vv\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057331 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.057517 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.158512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.158800 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.158946 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.158984 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-logs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.159036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-config-data\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.159098 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn7vv\" (UniqueName: \"kubernetes.io/projected/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-kube-api-access-rn7vv\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.160328 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-logs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.163710 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.164169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.171240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.173237 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-config-data\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.178085 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn7vv\" (UniqueName: \"kubernetes.io/projected/ae89a58d-cd03-4c0c-8d74-a683f1d77bf3-kube-api-access-rn7vv\") pod \"nova-api-0\" (UID: \"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3\") " pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.387938 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.552980 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.678960 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.679030 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.679077 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.679131 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.679663 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.679770 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.680082 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.680106 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzddc\" (UniqueName: \"kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.680595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts\") pod \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\" (UID: \"74ffcd67-da9c-4fbf-8d49-6e70f05af26f\") " Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.681609 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.681628 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.685716 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts" (OuterVolumeSpecName: "scripts") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.687584 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc" (OuterVolumeSpecName: "kube-api-access-tzddc") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "kube-api-access-tzddc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.835937 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzddc\" (UniqueName: \"kubernetes.io/projected/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-kube-api-access-tzddc\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.835975 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.870064 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.913424 4762 generic.go:334] "Generic (PLEG): container finished" podID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerID="4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9" exitCode=0 Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.913475 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerDied","Data":"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9"} Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.913513 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74ffcd67-da9c-4fbf-8d49-6e70f05af26f","Type":"ContainerDied","Data":"53a4b3bd63f44293085152d9f7d4b85cea85461ca8a361004a85eadfc54b7fd9"} Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.913537 4762 scope.go:117] "RemoveContainer" containerID="cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.913911 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.943467 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.948294 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.960408 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.965941 4762 scope.go:117] "RemoveContainer" containerID="38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699" Feb 17 14:33:49 crc kubenswrapper[4762]: I0217 14:33:49.992014 4762 scope.go:117] "RemoveContainer" containerID="fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.011219 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data" (OuterVolumeSpecName: "config-data") pod "74ffcd67-da9c-4fbf-8d49-6e70f05af26f" (UID: "74ffcd67-da9c-4fbf-8d49-6e70f05af26f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.026864 4762 scope.go:117] "RemoveContainer" containerID="4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.046674 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.046709 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ffcd67-da9c-4fbf-8d49-6e70f05af26f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.052331 4762 scope.go:117] "RemoveContainer" containerID="cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.054158 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876\": container with ID starting with cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876 not found: ID does not exist" containerID="cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.054203 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876"} err="failed to get container status \"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876\": rpc error: code = NotFound desc = could not find container \"cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876\": container with ID starting with cfbd9a474589da09814bc142b2550ff9b591e553699334ff172571aa36809876 not found: ID does not exist" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.054242 4762 scope.go:117] "RemoveContainer" containerID="38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.054681 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699\": container with ID starting with 38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699 not found: ID does not exist" containerID="38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.054728 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699"} err="failed to get container status \"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699\": rpc error: code = NotFound desc = could not find container \"38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699\": container with ID starting with 38c005fb4ede972752e9be806c898d98895c870ff3dea1ca5cd3a4f85d170699 not found: ID does not exist" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.054753 4762 scope.go:117] "RemoveContainer" containerID="fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.055385 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a\": container with ID starting with fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a not found: ID does not exist" containerID="fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.055411 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a"} err="failed to get container status \"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a\": rpc error: code = NotFound desc = could not find container \"fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a\": container with ID starting with fbb1b335bb117f67616d3c15e7df9a5a3de4b7bdcd660b7f2259050a4078c21a not found: ID does not exist" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.055425 4762 scope.go:117] "RemoveContainer" containerID="4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.055676 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9\": container with ID starting with 4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9 not found: ID does not exist" containerID="4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.055709 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9"} err="failed to get container status \"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9\": rpc error: code = NotFound desc = could not find container \"4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9\": container with ID starting with 4e8c64280e0fa23679386d76b483c283eb4c9ac82e0a981fc6d8615a32233ef9 not found: ID does not exist" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.094329 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b24e9c-a819-4791-b2d5-97e4e56a22c1" path="/var/lib/kubelet/pods/f6b24e9c-a819-4791-b2d5-97e4e56a22c1/volumes" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.208468 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 is running failed: container process not found" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.208786 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 is running failed: container process not found" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.209001 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 is running failed: container process not found" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.209033 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerName="nova-scheduler-scheduler" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.596175 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.253:8775/\": read tcp 10.217.0.2:42364->10.217.0.253:8775: read: connection reset by peer" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.596169 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.253:8775/\": read tcp 10.217.0.2:42368->10.217.0.253:8775: read: connection reset by peer" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.841928 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.848902 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.860746 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.871832 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.872595 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerName="nova-scheduler-scheduler" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.872616 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerName="nova-scheduler-scheduler" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.872635 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="proxy-httpd" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.872664 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="proxy-httpd" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.872677 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-central-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.872686 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-central-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.872710 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-notification-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 
14:33:50.872718 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-notification-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.872736 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="sg-core" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.872743 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="sg-core" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.873024 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="proxy-httpd" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.873043 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-central-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.873065 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="sg-core" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.873077 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerName="nova-scheduler-scheduler" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.873102 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" containerName="ceilometer-notification-agent" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.888790 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.891374 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.899039 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.901360 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.928232 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3","Type":"ContainerStarted","Data":"42c1b20d1d30ceecb4730d1bf1337797f3d7488667b28466a73ca641dc8d1a1f"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.928274 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3","Type":"ContainerStarted","Data":"8ac0c20fc0c3b71161c3b118ccd23c493556d957b29e9dc7bb076121d1f958e1"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.930191 4762 generic.go:334] "Generic (PLEG): container finished" podID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" exitCode=0 Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.930232 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfef0bfb-1f5e-4c74-b451-624612d99d6f","Type":"ContainerDied","Data":"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.930248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dfef0bfb-1f5e-4c74-b451-624612d99d6f","Type":"ContainerDied","Data":"846e418f8241e923cd0905f061b4652273d506eb4374b354b28a2585ec2c0ea2"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.930265 4762 scope.go:117] "RemoveContainer" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.930354 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.935068 4762 generic.go:334] "Generic (PLEG): container finished" podID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerID="59773f5a9db93ad22b346d36f4b50875a85c9b2c4b699bcec80eb85aa725692e" exitCode=0 Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.935117 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerDied","Data":"59773f5a9db93ad22b346d36f4b50875a85c9b2c4b699bcec80eb85aa725692e"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.940675 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfhs5\" (UniqueName: \"kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5\") pod \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.940948 4762 generic.go:334] "Generic (PLEG): container finished" podID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerID="25a310a6ea3a249f1ba5708296333d9999bd5e8c7e0b857a6864ec336c79f102" exitCode=137 Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941066 4762 generic.go:334] "Generic (PLEG): container finished" podID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerID="dc6c4a8ff8fae25315b467372ae51fb8f33bb19f086ced32b0839a20fe2f12e2" exitCode=137 Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941070 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data\") pod \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941110 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle\") pod \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\" (UID: \"dfef0bfb-1f5e-4c74-b451-624612d99d6f\") " Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941268 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerDied","Data":"25a310a6ea3a249f1ba5708296333d9999bd5e8c7e0b857a6864ec336c79f102"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941408 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerDied","Data":"dc6c4a8ff8fae25315b467372ae51fb8f33bb19f086ced32b0839a20fe2f12e2"} Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.941905 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.942018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hmt\" (UniqueName: \"kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.942329 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.942677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.942719 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.942743 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.946946 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5" (OuterVolumeSpecName: "kube-api-access-kfhs5") pod "dfef0bfb-1f5e-4c74-b451-624612d99d6f" (UID: "dfef0bfb-1f5e-4c74-b451-624612d99d6f"). InnerVolumeSpecName "kube-api-access-kfhs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.961897 4762 scope.go:117] "RemoveContainer" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" Feb 17 14:33:50 crc kubenswrapper[4762]: E0217 14:33:50.963196 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443\": container with ID starting with 00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 not found: ID does not exist" containerID="00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.963242 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443"} err="failed to get container status \"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443\": rpc error: code = NotFound desc = could not find container \"00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443\": container with ID starting with 00d02e878b65a9c9bb0013bba677340aa4426ea3b0b2acd9cc662815ef0af443 not found: ID does not exist" Feb 17 14:33:50 crc kubenswrapper[4762]: I0217 14:33:50.978159 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfef0bfb-1f5e-4c74-b451-624612d99d6f" (UID: "dfef0bfb-1f5e-4c74-b451-624612d99d6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.005590 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data" (OuterVolumeSpecName: "config-data") pod "dfef0bfb-1f5e-4c74-b451-624612d99d6f" (UID: "dfef0bfb-1f5e-4c74-b451-624612d99d6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057635 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057736 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057894 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.057925 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hmt\" (UniqueName: \"kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.058068 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.058108 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.058128 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfef0bfb-1f5e-4c74-b451-624612d99d6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.058143 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfhs5\" 
(UniqueName: \"kubernetes.io/projected/dfef0bfb-1f5e-4c74-b451-624612d99d6f-kube-api-access-kfhs5\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.058107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.271851 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.273275 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.275637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.275434 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.276324 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.302854 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hmt\" (UniqueName: \"kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt\") pod \"ceilometer-0\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.393111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data\") pod \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.393514 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts\") pod \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.404768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts" (OuterVolumeSpecName: "scripts") pod "78331bd2-6f9d-4613-ac62-672c89a6ea1b" (UID: "78331bd2-6f9d-4613-ac62-672c89a6ea1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.406816 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle\") pod \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.406988 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswdq\" (UniqueName: \"kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq\") pod \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\" (UID: \"78331bd2-6f9d-4613-ac62-672c89a6ea1b\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.408899 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.420856 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq" (OuterVolumeSpecName: "kube-api-access-bswdq") pod "78331bd2-6f9d-4613-ac62-672c89a6ea1b" (UID: "78331bd2-6f9d-4613-ac62-672c89a6ea1b"). InnerVolumeSpecName "kube-api-access-bswdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.510475 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.511784 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswdq\" (UniqueName: \"kubernetes.io/projected/78331bd2-6f9d-4613-ac62-672c89a6ea1b-kube-api-access-bswdq\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.573199 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.614472 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.614880 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data" (OuterVolumeSpecName: "config-data") pod "78331bd2-6f9d-4613-ac62-672c89a6ea1b" (UID: "78331bd2-6f9d-4613-ac62-672c89a6ea1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.669744 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.707168 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708051 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-listener" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.708151 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-listener" Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708286 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.708367 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708452 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-api" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.708521 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-api" Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708607 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.708700 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708798 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-evaluator" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.708867 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-evaluator" Feb 17 14:33:51 crc kubenswrapper[4762]: E0217 14:33:51.708965 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-notifier" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.709036 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-notifier" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.709444 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-log" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.709546 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-listener" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.715836 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" containerName="nova-metadata-metadata" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.716081 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-evaluator" Feb 17 
14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.716221 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-api" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.716307 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" containerName="aodh-notifier" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.721998 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs\") pod \"6bb9f998-3134-4e4b-91ee-6ee679264798\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.722075 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs\") pod \"6bb9f998-3134-4e4b-91ee-6ee679264798\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.722154 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qxbr\" (UniqueName: \"kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr\") pod \"6bb9f998-3134-4e4b-91ee-6ee679264798\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.722190 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data\") pod \"6bb9f998-3134-4e4b-91ee-6ee679264798\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.722205 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle\") pod \"6bb9f998-3134-4e4b-91ee-6ee679264798\" (UID: \"6bb9f998-3134-4e4b-91ee-6ee679264798\") " Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.722781 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.723756 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs" (OuterVolumeSpecName: "logs") pod "6bb9f998-3134-4e4b-91ee-6ee679264798" (UID: "6bb9f998-3134-4e4b-91ee-6ee679264798"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.729434 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.741111 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.741895 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78331bd2-6f9d-4613-ac62-672c89a6ea1b" (UID: "78331bd2-6f9d-4613-ac62-672c89a6ea1b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.776561 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr" (OuterVolumeSpecName: "kube-api-access-4qxbr") pod "6bb9f998-3134-4e4b-91ee-6ee679264798" (UID: "6bb9f998-3134-4e4b-91ee-6ee679264798"). InnerVolumeSpecName "kube-api-access-4qxbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:51 crc kubenswrapper[4762]: I0217 14:33:51.961935 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002180 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2gkn\" (UniqueName: \"kubernetes.io/projected/0a0b2598-a78d-461c-bd60-6eca94aed9d9-kube-api-access-z2gkn\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002354 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-config-data\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002886 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qxbr\" (UniqueName: \"kubernetes.io/projected/6bb9f998-3134-4e4b-91ee-6ee679264798-kube-api-access-4qxbr\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002922 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78331bd2-6f9d-4613-ac62-672c89a6ea1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.002935 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb9f998-3134-4e4b-91ee-6ee679264798-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.056924 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae89a58d-cd03-4c0c-8d74-a683f1d77bf3","Type":"ContainerStarted","Data":"5b825a08b4cd4a1ac5703c0df813f1cd5a55eb76025494d074264012feb960fd"} Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.061288 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bb9f998-3134-4e4b-91ee-6ee679264798" (UID: "6bb9f998-3134-4e4b-91ee-6ee679264798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.109179 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-config-data\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.109339 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gkn\" (UniqueName: \"kubernetes.io/projected/0a0b2598-a78d-461c-bd60-6eca94aed9d9-kube-api-access-z2gkn\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.109552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.109672 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.110276 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data" (OuterVolumeSpecName: "config-data") pod "6bb9f998-3134-4e4b-91ee-6ee679264798" (UID: "6bb9f998-3134-4e4b-91ee-6ee679264798"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.122676 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ffcd67-da9c-4fbf-8d49-6e70f05af26f" path="/var/lib/kubelet/pods/74ffcd67-da9c-4fbf-8d49-6e70f05af26f/volumes" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.123487 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.126978 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.137319 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfef0bfb-1f5e-4c74-b451-624612d99d6f" path="/var/lib/kubelet/pods/dfef0bfb-1f5e-4c74-b451-624612d99d6f/volumes" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.145284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0b2598-a78d-461c-bd60-6eca94aed9d9-config-data\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.159927 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2gkn\" (UniqueName: \"kubernetes.io/projected/0a0b2598-a78d-461c-bd60-6eca94aed9d9-kube-api-access-z2gkn\") pod \"nova-scheduler-0\" (UID: \"0a0b2598-a78d-461c-bd60-6eca94aed9d9\") " pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.176979 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6bb9f998-3134-4e4b-91ee-6ee679264798","Type":"ContainerDied","Data":"0f059d172921ce2b383b4c866b68d8981d326ef4dfa08b7f3b63c0b7f9285426"} Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.177034 4762 scope.go:117] "RemoveContainer" containerID="59773f5a9db93ad22b346d36f4b50875a85c9b2c4b699bcec80eb85aa725692e" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.180982 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.180963322 podStartE2EDuration="4.180963322s" podCreationTimestamp="2026-02-17 14:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:52.119135275 +0000 UTC m=+1712.699135947" watchObservedRunningTime="2026-02-17 14:33:52.180963322 +0000 UTC m=+1712.760963974" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.194436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"78331bd2-6f9d-4613-ac62-672c89a6ea1b","Type":"ContainerDied","Data":"9308474c37c5a8ca5541eac43b1e7794910c70dd81935ec11b5856cfd5055da9"} Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.194575 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.218044 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.246321 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.251859 4762 scope.go:117] "RemoveContainer" containerID="e516c2d595f01a19af1b3b7531bf2bd3e4520e05d113cb97d33cbdbed416b182" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.269768 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.280888 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6bb9f998-3134-4e4b-91ee-6ee679264798" (UID: "6bb9f998-3134-4e4b-91ee-6ee679264798"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.292275 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.300308 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.303971 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.304267 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.304854 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.305489 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.306746 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xczfd" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.307224 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.307395 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.322514 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkp2c\" (UniqueName: \"kubernetes.io/projected/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-kube-api-access-mkp2c\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.322701 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-scripts\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.322759 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-public-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.322844 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-config-data\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.323010 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.323075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-internal-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.323237 4762 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb9f998-3134-4e4b-91ee-6ee679264798-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.574798 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc 
kubenswrapper[4762]: I0217 14:33:52.574862 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-internal-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.574970 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkp2c\" (UniqueName: \"kubernetes.io/projected/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-kube-api-access-mkp2c\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.575028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-scripts\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.575055 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-public-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.575102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-config-data\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.587309 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-config-data\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.590698 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-scripts\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.591075 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-internal-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.594269 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.606761 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-public-tls-certs\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.623261 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkp2c\" (UniqueName: 
\"kubernetes.io/projected/55524ce8-1fb2-4a0c-ad16-e6ba37940c0a-kube-api-access-mkp2c\") pod \"aodh-0\" (UID: \"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a\") " pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.678122 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.733681 4762 scope.go:117] "RemoveContainer" containerID="25a310a6ea3a249f1ba5708296333d9999bd5e8c7e0b857a6864ec336c79f102" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.734251 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.746210 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.774904 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.807430 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.811222 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.815999 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.821156 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.838975 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.849408 4762 scope.go:117] "RemoveContainer" containerID="dc6c4a8ff8fae25315b467372ae51fb8f33bb19f086ced32b0839a20fe2f12e2" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.897584 4762 scope.go:117] "RemoveContainer" containerID="bfaa8ab977c5183d16ee6888a20a0627d6e28f847b814137b81448edd19e2403" Feb 17 14:33:52 crc kubenswrapper[4762]: I0217 14:33:52.946688 4762 scope.go:117] "RemoveContainer" containerID="00d51b9c5984bae0b8d24c37dc4dfe0832ed42d5c56b0c95a5ee8bd82342d8e1" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.151890 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfq8\" (UniqueName: \"kubernetes.io/projected/338b2e6a-3e06-422f-8e9b-917735470caa-kube-api-access-brfq8\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.152226 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-config-data\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.152360 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.152445 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338b2e6a-3e06-422f-8e9b-917735470caa-logs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.152506 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.263555 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.263721 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338b2e6a-3e06-422f-8e9b-917735470caa-logs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.263836 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.264075 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfq8\" (UniqueName: \"kubernetes.io/projected/338b2e6a-3e06-422f-8e9b-917735470caa-kube-api-access-brfq8\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.264123 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-config-data\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.265161 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338b2e6a-3e06-422f-8e9b-917735470caa-logs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.272317 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.272577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.282949 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerStarted","Data":"90b7e9d15ecebd7aa0cdc60a8618b6c9a6e4696c14d6adb3490fedcb238f7b51"} Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.284758 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.285497 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338b2e6a-3e06-422f-8e9b-917735470caa-config-data\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.299378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfq8\" (UniqueName: \"kubernetes.io/projected/338b2e6a-3e06-422f-8e9b-917735470caa-kube-api-access-brfq8\") pod \"nova-metadata-0\" (UID: \"338b2e6a-3e06-422f-8e9b-917735470caa\") " pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.444175 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:33:53 crc kubenswrapper[4762]: I0217 14:33:53.557281 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.094296 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb9f998-3134-4e4b-91ee-6ee679264798" path="/var/lib/kubelet/pods/6bb9f998-3134-4e4b-91ee-6ee679264798/volumes" Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.095449 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78331bd2-6f9d-4613-ac62-672c89a6ea1b" path="/var/lib/kubelet/pods/78331bd2-6f9d-4613-ac62-672c89a6ea1b/volumes" Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.193148 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.314553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerStarted","Data":"1e7a95463ae41c449711f79d70645c32c01f6a6ea9dfba9c938671ad754bbe77"} Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.320634 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a","Type":"ContainerStarted","Data":"adf81dfaadb7d5f75cc7da60dd8d5824ba9a9bfe30376e8a3e937584769931a0"} Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.328174 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0a0b2598-a78d-461c-bd60-6eca94aed9d9","Type":"ContainerStarted","Data":"eb507c2e8d5c75468d2d9d77374c58ecf76d26d27453e1d2b8af9ec73df90234"} Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.328221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0a0b2598-a78d-461c-bd60-6eca94aed9d9","Type":"ContainerStarted","Data":"38933e781c1a28b86e8be45cdea12720ea89f4ba341f34029dd042ba5f82ed26"} Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.331099 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"338b2e6a-3e06-422f-8e9b-917735470caa","Type":"ContainerStarted","Data":"69b6145a46ddb0b2be452dbad51a7c44be89277dbe9d4320fa713e91582d2b37"} Feb 17 14:33:54 crc kubenswrapper[4762]: I0217 14:33:54.360982 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.360953448 podStartE2EDuration="3.360953448s" podCreationTimestamp="2026-02-17 14:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:54.356915589 +0000 UTC m=+1714.936916241" watchObservedRunningTime="2026-02-17 14:33:54.360953448 +0000 UTC m=+1714.940954100" Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.412520 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerStarted","Data":"25838f1adb278d9a3ff37d5a9be3807e3f530a45fd667afa10153deb9545bbdb"} Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.416323 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a","Type":"ContainerStarted","Data":"4d4120c83513141e6a38093f16c3bb6bbf78df602343b59f4e9081c2b487a6f6"} Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.416381 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a","Type":"ContainerStarted","Data":"56f730bd8db2ef7eeaee5e4994c18b0cdcb309802de492479be66181075a8da2"} Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.438266 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338b2e6a-3e06-422f-8e9b-917735470caa","Type":"ContainerStarted","Data":"76018f0bec698a2c0a577f5372be4ead15e6410acc5275773b5f3a4e0f543a6b"} Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.438313 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338b2e6a-3e06-422f-8e9b-917735470caa","Type":"ContainerStarted","Data":"1a1258b2118e6bc99aa45fc488f25050fb7d4d6080976a5a204676e0565765d0"} Feb 17 14:33:55 crc kubenswrapper[4762]: I0217 14:33:55.488013 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.487957681 podStartE2EDuration="3.487957681s" podCreationTimestamp="2026-02-17 14:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:33:55.48054475 +0000 UTC m=+1716.060545402" watchObservedRunningTime="2026-02-17 14:33:55.487957681 +0000 UTC m=+1716.067958333" Feb 17 14:33:56 crc kubenswrapper[4762]: I0217 14:33:56.712196 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a","Type":"ContainerStarted","Data":"85c2c9e3898c4ca22c90cf393f36675c844ef94f492f60613689d8abf6d01e60"} Feb 17 14:33:56 crc kubenswrapper[4762]: I0217 14:33:56.722753 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerStarted","Data":"211642233b8cec5a8f6a40bfa9689bd70558d61af6201b8493737d021fa7964e"} Feb 17 14:33:57 crc kubenswrapper[4762]: I0217 14:33:57.373297 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 14:33:57 crc kubenswrapper[4762]: 
I0217 14:33:57.736753 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"55524ce8-1fb2-4a0c-ad16-e6ba37940c0a","Type":"ContainerStarted","Data":"9bbf483877981a92d38eb0b21a2d3f041bfcedbb95b1fd9b51a3fc2ef6fda057"} Feb 17 14:33:57 crc kubenswrapper[4762]: I0217 14:33:57.778768 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.61889891 podStartE2EDuration="5.778739704s" podCreationTimestamp="2026-02-17 14:33:52 +0000 UTC" firstStartedPulling="2026-02-17 14:33:53.622576395 +0000 UTC m=+1714.202577047" lastFinishedPulling="2026-02-17 14:33:56.782417189 +0000 UTC m=+1717.362417841" observedRunningTime="2026-02-17 14:33:57.759709397 +0000 UTC m=+1718.339710069" watchObservedRunningTime="2026-02-17 14:33:57.778739704 +0000 UTC m=+1718.358740356" Feb 17 14:33:58 crc kubenswrapper[4762]: I0217 14:33:58.759785 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:33:58 crc kubenswrapper[4762]: I0217 14:33:58.760964 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:33:59 crc kubenswrapper[4762]: I0217 14:33:59.301683 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:33:59 crc kubenswrapper[4762]: E0217 14:33:59.302375 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:33:59 crc kubenswrapper[4762]: I0217 14:33:59.388180 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:33:59 crc kubenswrapper[4762]: I0217 14:33:59.388228 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:34:00 crc kubenswrapper[4762]: I0217 14:34:00.002739 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerStarted","Data":"176af4d5efd34f5cee2fa9e778fb1ed9ff4c13ef57b5e2a346d45fc12719cbaf"} Feb 17 14:34:00 crc kubenswrapper[4762]: I0217 14:34:00.003334 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:34:00 crc kubenswrapper[4762]: I0217 14:34:00.041999 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.567367261 podStartE2EDuration="10.041973569s" podCreationTimestamp="2026-02-17 14:33:50 +0000 UTC" firstStartedPulling="2026-02-17 14:33:52.690609177 +0000 UTC m=+1713.270609829" lastFinishedPulling="2026-02-17 14:33:58.165215485 +0000 UTC m=+1718.745216137" observedRunningTime="2026-02-17 14:34:00.029106359 +0000 UTC m=+1720.609107011" watchObservedRunningTime="2026-02-17 14:34:00.041973569 +0000 UTC m=+1720.621974221" Feb 17 14:34:00 crc kubenswrapper[4762]: I0217 14:34:00.559069 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae89a58d-cd03-4c0c-8d74-a683f1d77bf3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:34:00 crc kubenswrapper[4762]: I0217 14:34:00.559932 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae89a58d-cd03-4c0c-8d74-a683f1d77bf3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:34:02 crc kubenswrapper[4762]: I0217 14:34:02.305266 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 14:34:02 crc kubenswrapper[4762]: I0217 14:34:02.651936 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 14:34:03 crc kubenswrapper[4762]: I0217 14:34:03.444881 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:34:03 crc kubenswrapper[4762]: I0217 14:34:03.444932 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:34:03 crc kubenswrapper[4762]: I0217 14:34:03.687568 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 14:34:04 crc kubenswrapper[4762]: I0217 14:34:04.704062 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="338b2e6a-3e06-422f-8e9b-917735470caa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.11:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:34:04 crc kubenswrapper[4762]: I0217 14:34:04.706823 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="338b2e6a-3e06-422f-8e9b-917735470caa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.11:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.394410 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.395889 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.402215 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.403872 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.760551 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:34:09 crc kubenswrapper[4762]: I0217 14:34:09.775094 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:34:10 crc kubenswrapper[4762]: I0217 14:34:10.088846 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:34:10 crc kubenswrapper[4762]: E0217 14:34:10.090122 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:34:13 crc kubenswrapper[4762]: I0217 14:34:13.451199 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 14:34:13 crc kubenswrapper[4762]: I0217 14:34:13.454169 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 14:34:13 crc kubenswrapper[4762]: I0217 14:34:13.465367 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 14:34:13 crc kubenswrapper[4762]: I0217 14:34:13.815000 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 14:34:20 crc kubenswrapper[4762]: I0217 14:34:20.661402 4762 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod74ffcd67-da9c-4fbf-8d49-6e70f05af26f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod74ffcd67-da9c-4fbf-8d49-6e70f05af26f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod74ffcd67_da9c_4fbf_8d49_6e70f05af26f.slice" Feb 17 14:34:21 crc kubenswrapper[4762]: I0217 14:34:21.523452 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 14:34:23 crc kubenswrapper[4762]: I0217 14:34:23.114788 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:34:23 crc kubenswrapper[4762]: E0217 14:34:23.115978 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.166304 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.167137 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6d19ed64-87e9-4afd-9c02-4319baed9bda" containerName="kube-state-metrics" containerID="cri-o://8d3fbee898bdd4c5f8b01484c224574c540d666bff1c4ba85cf0894b8064fa05" gracePeriod=30 Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.279484 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.280190 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" containerName="mysqld-exporter" containerID="cri-o://16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b" gracePeriod=30 Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.325743 4762 generic.go:334] "Generic (PLEG): container finished" podID="6d19ed64-87e9-4afd-9c02-4319baed9bda" containerID="8d3fbee898bdd4c5f8b01484c224574c540d666bff1c4ba85cf0894b8064fa05" exitCode=2 Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.325833 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d19ed64-87e9-4afd-9c02-4319baed9bda","Type":"ContainerDied","Data":"8d3fbee898bdd4c5f8b01484c224574c540d666bff1c4ba85cf0894b8064fa05"} Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.808996 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.930339 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9967c\" (UniqueName: \"kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c\") pod \"6d19ed64-87e9-4afd-9c02-4319baed9bda\" (UID: \"6d19ed64-87e9-4afd-9c02-4319baed9bda\") " Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.942081 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c" (OuterVolumeSpecName: "kube-api-access-9967c") pod "6d19ed64-87e9-4afd-9c02-4319baed9bda" (UID: "6d19ed64-87e9-4afd-9c02-4319baed9bda"). InnerVolumeSpecName "kube-api-access-9967c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:34:28 crc kubenswrapper[4762]: I0217 14:34:28.957165 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.033031 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9967c\" (UniqueName: \"kubernetes.io/projected/6d19ed64-87e9-4afd-9c02-4319baed9bda-kube-api-access-9967c\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.134214 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-655dk\" (UniqueName: \"kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk\") pod \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.134473 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data\") pod \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.134604 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle\") pod \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\" (UID: \"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2\") " Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.137768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk" (OuterVolumeSpecName: "kube-api-access-655dk") pod "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" (UID: "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2"). InnerVolumeSpecName "kube-api-access-655dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.177476 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" (UID: "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.203588 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data" (OuterVolumeSpecName: "config-data") pod "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" (UID: "ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.239685 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.239722 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-655dk\" (UniqueName: \"kubernetes.io/projected/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-kube-api-access-655dk\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.239736 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.341627 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d19ed64-87e9-4afd-9c02-4319baed9bda","Type":"ContainerDied","Data":"1df58b4fd92738c11d81716ff930e671f339de9e1442edaa30e82ee552ff13dc"} Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.341951 4762 scope.go:117] "RemoveContainer" containerID="8d3fbee898bdd4c5f8b01484c224574c540d666bff1c4ba85cf0894b8064fa05" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.341875 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.344547 4762 generic.go:334] "Generic (PLEG): container finished" podID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" containerID="16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b" exitCode=2 Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.344593 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2","Type":"ContainerDied","Data":"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b"} Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.344608 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.344622 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2","Type":"ContainerDied","Data":"ae9451183557f75a2b0627cf76735216c702a245f2b97d42a3464d54f14ea026"} Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.395630 4762 scope.go:117] "RemoveContainer" containerID="16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.424414 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.438169 4762 scope.go:117] "RemoveContainer" containerID="16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b" Feb 17 14:34:29 crc kubenswrapper[4762]: E0217 14:34:29.438722 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b\": container with ID starting with 16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b not found: ID does not exist" containerID="16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.438783 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b"} err="failed to get container status \"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b\": rpc error: code = NotFound desc = could not find container \"16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b\": container with ID starting with 16cf48ff1adeccae542efe150820351310b30eeab76a682aa7f887e6ca130c6b not found: ID does not exist" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.490844 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.512799 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.527784 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.544726 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: E0217 14:34:29.545324 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" containerName="mysqld-exporter" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.545352 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" containerName="mysqld-exporter" Feb 17 14:34:29 crc kubenswrapper[4762]: E0217 14:34:29.545384 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d19ed64-87e9-4afd-9c02-4319baed9bda" containerName="kube-state-metrics" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.545392 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d19ed64-87e9-4afd-9c02-4319baed9bda" containerName="kube-state-metrics" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.545727 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" containerName="mysqld-exporter" Feb 17 
14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.545758 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d19ed64-87e9-4afd-9c02-4319baed9bda" containerName="kube-state-metrics" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.548601 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.553164 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.553360 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.559078 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-config-data\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.559125 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.559144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kld8l\" (UniqueName: \"kubernetes.io/projected/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-kube-api-access-kld8l\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.559227 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.572236 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.574270 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.576894 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.577088 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.598172 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.614602 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.660932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661031 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661164 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661187 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nhm5\" (UniqueName: \"kubernetes.io/projected/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-api-access-7nhm5\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-config-data\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661255 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661276 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.661292 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kld8l\" (UniqueName: \"kubernetes.io/projected/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-kube-api-access-kld8l\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.673007 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.674461 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-config-data\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.679917 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.680799 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kld8l\" (UniqueName: \"kubernetes.io/projected/ceb5a67d-f2f8-4d60-b90e-8cc0c3599146-kube-api-access-kld8l\") pod \"mysqld-exporter-0\" (UID: \"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146\") " pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.763562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.763677 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.763917 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.763956 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nhm5\" (UniqueName: \"kubernetes.io/projected/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-api-access-7nhm5\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.767866 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.768612 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.769553 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.781700 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nhm5\" (UniqueName: \"kubernetes.io/projected/398ab3b8-a4d9-48fd-9236-9e7aed43e7d9-kube-api-access-7nhm5\") pod \"kube-state-metrics-0\" (UID: \"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9\") " pod="openstack/kube-state-metrics-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.868268 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 17 14:34:29 crc kubenswrapper[4762]: I0217 14:34:29.896070 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.136766 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d19ed64-87e9-4afd-9c02-4319baed9bda" path="/var/lib/kubelet/pods/6d19ed64-87e9-4afd-9c02-4319baed9bda/volumes" Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.139128 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2" path="/var/lib/kubelet/pods/ee9b9ac0-7ac0-421a-a94d-8b25a433e7e2/volumes" Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.484717 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.706302 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.934119 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.934426 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-central-agent" containerID="cri-o://1e7a95463ae41c449711f79d70645c32c01f6a6ea9dfba9c938671ad754bbe77" gracePeriod=30 Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.935016 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="proxy-httpd" containerID="cri-o://176af4d5efd34f5cee2fa9e778fb1ed9ff4c13ef57b5e2a346d45fc12719cbaf" gracePeriod=30 Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.935077 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="sg-core" containerID="cri-o://211642233b8cec5a8f6a40bfa9689bd70558d61af6201b8493737d021fa7964e" gracePeriod=30 Feb 17 14:34:30 crc kubenswrapper[4762]: I0217 14:34:30.935120 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-notification-agent" containerID="cri-o://25838f1adb278d9a3ff37d5a9be3807e3f530a45fd667afa10153deb9545bbdb" gracePeriod=30 Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.380344 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9","Type":"ContainerStarted","Data":"0dbe604abfc7384c45507109d5ec06bf94dfd61787b05d397e3a39998c4132a5"} Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.382934 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146","Type":"ContainerStarted","Data":"bbc544bdf391c9017b06d730982bcd4213ac1c9496e9cb42667088f8f94740d2"} Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.387723 4762 generic.go:334] "Generic (PLEG): container finished" podID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerID="176af4d5efd34f5cee2fa9e778fb1ed9ff4c13ef57b5e2a346d45fc12719cbaf" exitCode=0 Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.387759 4762 generic.go:334] "Generic (PLEG): container finished" podID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerID="211642233b8cec5a8f6a40bfa9689bd70558d61af6201b8493737d021fa7964e" exitCode=2 Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.387765 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerDied","Data":"176af4d5efd34f5cee2fa9e778fb1ed9ff4c13ef57b5e2a346d45fc12719cbaf"} Feb 17 14:34:31 crc kubenswrapper[4762]: I0217 14:34:31.387803 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerDied","Data":"211642233b8cec5a8f6a40bfa9689bd70558d61af6201b8493737d021fa7964e"} Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.400336 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"398ab3b8-a4d9-48fd-9236-9e7aed43e7d9","Type":"ContainerStarted","Data":"af8a7acd7c3e6d0e7513c94ac785c0c4c0963d854bd301d44d0bc55abce7d433"} Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.402547 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ceb5a67d-f2f8-4d60-b90e-8cc0c3599146","Type":"ContainerStarted","Data":"cdb7776c7b696618874b840ac3ae0235059642ecfdd6d2353600cb5d05ef4842"} Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.406434 4762 generic.go:334] "Generic (PLEG): container finished" podID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerID="1e7a95463ae41c449711f79d70645c32c01f6a6ea9dfba9c938671ad754bbe77" exitCode=0 Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.406492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerDied","Data":"1e7a95463ae41c449711f79d70645c32c01f6a6ea9dfba9c938671ad754bbe77"} Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.441048 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=2.98365043 podStartE2EDuration="3.441016566s" podCreationTimestamp="2026-02-17 14:34:29 +0000 UTC" firstStartedPulling="2026-02-17 14:34:30.714436938 +0000 UTC m=+1751.294437590" lastFinishedPulling="2026-02-17 14:34:31.171803074 +0000 UTC m=+1751.751803726" observedRunningTime="2026-02-17 14:34:32.417670762 +0000 UTC m=+1752.997671414" watchObservedRunningTime="2026-02-17 14:34:32.441016566 +0000 UTC m=+1753.021017218" Feb 17 14:34:32 crc kubenswrapper[4762]: I0217 14:34:32.449806 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.874432926 podStartE2EDuration="3.449780884s" podCreationTimestamp="2026-02-17 14:34:29 +0000 UTC" firstStartedPulling="2026-02-17 14:34:30.481415623 +0000 UTC m=+1751.061416275" lastFinishedPulling="2026-02-17 14:34:31.056763581 +0000 UTC m=+1751.636764233" observedRunningTime="2026-02-17 14:34:32.448593361 +0000 UTC m=+1753.028594013" watchObservedRunningTime="2026-02-17 14:34:32.449780884 +0000 UTC m=+1753.029781536" Feb 17 14:34:33 crc kubenswrapper[4762]: I0217 14:34:33.417307 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.451887 4762 generic.go:334] "Generic (PLEG): container finished" podID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerID="25838f1adb278d9a3ff37d5a9be3807e3f530a45fd667afa10153deb9545bbdb" exitCode=0 Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.451935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerDied","Data":"25838f1adb278d9a3ff37d5a9be3807e3f530a45fd667afa10153deb9545bbdb"} Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.563502 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669142 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669216 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669313 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669427 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669555 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669593 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5hmt\" (UniqueName: \"kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.669687 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts\") pod \"7702b544-101e-46ba-ab3c-03c3a94bd50d\" (UID: \"7702b544-101e-46ba-ab3c-03c3a94bd50d\") " Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.670490 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.670920 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.674535 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt" (OuterVolumeSpecName: "kube-api-access-r5hmt") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "kube-api-access-r5hmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.676013 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts" (OuterVolumeSpecName: "scripts") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.708493 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.773929 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.773968 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5hmt\" (UniqueName: \"kubernetes.io/projected/7702b544-101e-46ba-ab3c-03c3a94bd50d-kube-api-access-r5hmt\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.773981 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.773993 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.774004 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7702b544-101e-46ba-ab3c-03c3a94bd50d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.774533 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.806531 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data" (OuterVolumeSpecName: "config-data") pod "7702b544-101e-46ba-ab3c-03c3a94bd50d" (UID: "7702b544-101e-46ba-ab3c-03c3a94bd50d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.876069 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:35 crc kubenswrapper[4762]: I0217 14:34:35.876106 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7702b544-101e-46ba-ab3c-03c3a94bd50d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.464930 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7702b544-101e-46ba-ab3c-03c3a94bd50d","Type":"ContainerDied","Data":"90b7e9d15ecebd7aa0cdc60a8618b6c9a6e4696c14d6adb3490fedcb238f7b51"} Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.464983 4762 scope.go:117] "RemoveContainer" containerID="176af4d5efd34f5cee2fa9e778fb1ed9ff4c13ef57b5e2a346d45fc12719cbaf" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.465004 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.495136 4762 scope.go:117] "RemoveContainer" containerID="211642233b8cec5a8f6a40bfa9689bd70558d61af6201b8493737d021fa7964e" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.503433 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.517458 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.519724 4762 scope.go:117] "RemoveContainer" containerID="25838f1adb278d9a3ff37d5a9be3807e3f530a45fd667afa10153deb9545bbdb" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.528933 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:36 crc kubenswrapper[4762]: E0217 14:34:36.529478 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-notification-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529497 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-notification-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: E0217 14:34:36.529516 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-central-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529523 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-central-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: E0217 14:34:36.529553 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="proxy-httpd" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529560 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="proxy-httpd" Feb 17 14:34:36 crc kubenswrapper[4762]: E0217 14:34:36.529578 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="sg-core" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529584 4762 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="sg-core" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529815 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-central-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529831 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="sg-core" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529855 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="ceilometer-notification-agent" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.529866 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" containerName="proxy-httpd" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.532377 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.535141 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.535156 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.535834 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.553019 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.565422 4762 scope.go:117] "RemoveContainer" containerID="1e7a95463ae41c449711f79d70645c32c01f6a6ea9dfba9c938671ad754bbe77" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.596842 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-scripts\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.596923 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-config-data\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597125 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597248 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597314 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2xd\" (UniqueName: \"kubernetes.io/projected/f1b6239e-147b-429a-8765-dce18c23d63b-kube-api-access-7m2xd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597352 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.597563 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.789794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790464 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2xd\" (UniqueName: \"kubernetes.io/projected/f1b6239e-147b-429a-8765-dce18c23d63b-kube-api-access-7m2xd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790503 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790624 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790757 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-scripts\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-config-data\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790869 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.791001 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.799573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-config-data\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.790301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.800175 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-scripts\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.800304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b6239e-147b-429a-8765-dce18c23d63b-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.801051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.802313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.803086 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b6239e-147b-429a-8765-dce18c23d63b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.816971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2xd\" (UniqueName: \"kubernetes.io/projected/f1b6239e-147b-429a-8765-dce18c23d63b-kube-api-access-7m2xd\") pod \"ceilometer-0\" (UID: \"f1b6239e-147b-429a-8765-dce18c23d63b\") " pod="openstack/ceilometer-0" Feb 17 14:34:36 crc kubenswrapper[4762]: I0217 14:34:36.864006 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:34:37 crc kubenswrapper[4762]: I0217 14:34:37.072073 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:34:37 crc kubenswrapper[4762]: E0217 14:34:37.072577 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:34:37 crc kubenswrapper[4762]: I0217 14:34:37.357024 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:34:37 crc kubenswrapper[4762]: I0217 14:34:37.477407 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b6239e-147b-429a-8765-dce18c23d63b","Type":"ContainerStarted","Data":"bdbd5b196127d73f3509c0e2a637574df72f184f547513bbd5e3b3b0768e425b"} Feb 17 14:34:38 crc kubenswrapper[4762]: I0217 14:34:38.083133 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7702b544-101e-46ba-ab3c-03c3a94bd50d" path="/var/lib/kubelet/pods/7702b544-101e-46ba-ab3c-03c3a94bd50d/volumes" Feb 17 14:34:38 crc kubenswrapper[4762]: I0217 14:34:38.514680 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b6239e-147b-429a-8765-dce18c23d63b","Type":"ContainerStarted","Data":"4f0be96933fe8621427b8c92f75dc40dba67894383f5ca8e19be87549db6cfc0"} Feb 17 14:34:39 crc kubenswrapper[4762]: I0217 14:34:39.551450 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b6239e-147b-429a-8765-dce18c23d63b","Type":"ContainerStarted","Data":"87a199d433e85661d7069abeeb0d2e802a3d3eff5234308b5ff2241fb120a147"} Feb 17 14:34:39 crc kubenswrapper[4762]: I0217 14:34:39.909015 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 14:34:40 crc kubenswrapper[4762]: I0217 14:34:40.715871 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b6239e-147b-429a-8765-dce18c23d63b","Type":"ContainerStarted","Data":"fbc0170d267ad8fa3ac84cfa7593d8a14b98b9c851f5b81534461c42ac5bc43e"} Feb 17 14:34:42 crc kubenswrapper[4762]: I0217 14:34:42.740459 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b6239e-147b-429a-8765-dce18c23d63b","Type":"ContainerStarted","Data":"740aa4a2c69c9a4f185667d5310e494cf6a75f6597e4c25c2a47049395406a1d"} Feb 17 14:34:42 crc kubenswrapper[4762]: I0217 14:34:42.741087 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:34:42 crc kubenswrapper[4762]: I0217 14:34:42.769225 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.786277067 podStartE2EDuration="6.769203213s" podCreationTimestamp="2026-02-17 14:34:36 +0000 UTC" firstStartedPulling="2026-02-17 14:34:37.381947147 +0000 UTC m=+1757.961947799" lastFinishedPulling="2026-02-17 14:34:41.364873293 +0000 UTC m=+1761.944873945" observedRunningTime="2026-02-17 14:34:42.768051742 +0000 UTC m=+1763.348052394" watchObservedRunningTime="2026-02-17 14:34:42.769203213 +0000 UTC 
m=+1763.349203865" Feb 17 14:34:48 crc kubenswrapper[4762]: I0217 14:34:48.071895 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:34:48 crc kubenswrapper[4762]: E0217 14:34:48.073006 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:35:00 crc kubenswrapper[4762]: I0217 14:35:00.082242 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:35:00 crc kubenswrapper[4762]: E0217 14:35:00.083140 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.824248 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xsj4g/must-gather-xb8ps"] Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.831492 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.835135 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xsj4g"/"openshift-service-ca.crt" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.838443 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xsj4g"/"kube-root-ca.crt" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.858273 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xsj4g/must-gather-xb8ps"] Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.936624 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.972748 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7gnl\" (UniqueName: \"kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl\") pod \"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:06 crc kubenswrapper[4762]: I0217 14:35:06.974221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output\") pod \"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.076972 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7gnl\" (UniqueName: \"kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl\") pod 
\"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.077197 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output\") pod \"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.077618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output\") pod \"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.115243 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7gnl\" (UniqueName: \"kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl\") pod \"must-gather-xb8ps\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") " pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.155440 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" Feb 17 14:35:07 crc kubenswrapper[4762]: I0217 14:35:07.787585 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xsj4g/must-gather-xb8ps"] Feb 17 14:35:08 crc kubenswrapper[4762]: I0217 14:35:08.176970 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" event={"ID":"8bfff96d-6c90-4a80-9024-7539e414a009","Type":"ContainerStarted","Data":"a2b6d658a2c08c657199bccfe57b596ec911d374d1b77da0a3b4bead2251211a"} Feb 17 14:35:11 crc kubenswrapper[4762]: I0217 14:35:11.071106 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:35:11 crc kubenswrapper[4762]: E0217 14:35:11.071671 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:35:15 crc kubenswrapper[4762]: I0217 14:35:15.270569 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" event={"ID":"8bfff96d-6c90-4a80-9024-7539e414a009","Type":"ContainerStarted","Data":"e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23"} Feb 17 14:35:15 crc kubenswrapper[4762]: I0217 14:35:15.271105 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" event={"ID":"8bfff96d-6c90-4a80-9024-7539e414a009","Type":"ContainerStarted","Data":"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"} Feb 17 14:35:15 crc kubenswrapper[4762]: I0217 14:35:15.297947 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" podStartSLOduration=2.901572621 podStartE2EDuration="9.29792779s" 
podCreationTimestamp="2026-02-17 14:35:06 +0000 UTC" firstStartedPulling="2026-02-17 14:35:07.791973082 +0000 UTC m=+1788.371973734" lastFinishedPulling="2026-02-17 14:35:14.188328251 +0000 UTC m=+1794.768328903" observedRunningTime="2026-02-17 14:35:15.282921043 +0000 UTC m=+1795.862921695" watchObservedRunningTime="2026-02-17 14:35:15.29792779 +0000 UTC m=+1795.877928442" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.678611 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-xgnxh"] Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.681016 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.685016 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xsj4g"/"default-dockercfg-4hpbt" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.751511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zxvq\" (UniqueName: \"kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.751686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.854467 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zxvq\" (UniqueName: \"kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.854873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.855001 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:19 crc kubenswrapper[4762]: I0217 14:35:19.891364 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zxvq\" (UniqueName: \"kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq\") pod \"crc-debug-xgnxh\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:20 crc kubenswrapper[4762]: I0217 14:35:20.006930 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:35:20 crc kubenswrapper[4762]: W0217 14:35:20.058254 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc62a5450_285a_4d9d_b5c8_6a4bb248e37d.slice/crio-d1dcbba9bb7f26edb0d94ab5eb10bd68c675827abcc677982965b5baaa720312 WatchSource:0}: Error finding container d1dcbba9bb7f26edb0d94ab5eb10bd68c675827abcc677982965b5baaa720312: Status 404 returned error can't find the container with id d1dcbba9bb7f26edb0d94ab5eb10bd68c675827abcc677982965b5baaa720312 Feb 17 14:35:20 crc kubenswrapper[4762]: I0217 14:35:20.323465 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" event={"ID":"c62a5450-285a-4d9d-b5c8-6a4bb248e37d","Type":"ContainerStarted","Data":"d1dcbba9bb7f26edb0d94ab5eb10bd68c675827abcc677982965b5baaa720312"} Feb 17 14:35:22 crc kubenswrapper[4762]: I0217 14:35:22.071815 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:35:22 crc kubenswrapper[4762]: E0217 14:35:22.072371 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:35:24 crc kubenswrapper[4762]: I0217 14:35:24.726486 4762 scope.go:117] "RemoveContainer" containerID="2374a3728cd95390711955d903773f7b4614b1795c447ce88c14d0a6d7eaaa26" Feb 17 14:35:24 crc kubenswrapper[4762]: I0217 14:35:24.768635 4762 scope.go:117] "RemoveContainer" containerID="04011dc64b4c9f1f4b73753d11fcd7079b50ab16e9d738bd6611369fe1d52847" Feb 17 14:35:24 crc kubenswrapper[4762]: I0217 14:35:24.839927 4762 scope.go:117] "RemoveContainer" containerID="b8efb2c46c08b1153856a7affefe3521f37a0170301d64f770b195f1c329f359" Feb 17 14:35:24 crc kubenswrapper[4762]: I0217 14:35:24.902354 4762 scope.go:117] "RemoveContainer" containerID="630e37dab7f019f6a2702f87903daaf8a2d343b5f5d4e2a8a3d76495731261c0" Feb 17 14:35:25 crc kubenswrapper[4762]: I0217 14:35:25.023437 4762 scope.go:117] "RemoveContainer" containerID="57831539b956592372abb05c0e8265ae6c1b0b4dbde3f14741138fed85b064b3" Feb 17 14:35:25 crc kubenswrapper[4762]: I0217 14:35:25.058699 4762 scope.go:117] "RemoveContainer" containerID="31ca1341142a5a93a903a4b632666e572dc9639b7ed02f26803e5113e0b8521d" Feb 17 14:35:25 crc kubenswrapper[4762]: I0217 14:35:25.139931 4762 scope.go:117] "RemoveContainer" containerID="ffa0682b9630e37ebaeb4bb355fef8eacbfab92142bc4c22ece878abd668ded5" Feb 17 14:35:25 crc kubenswrapper[4762]: I0217 14:35:25.167746 4762 scope.go:117] "RemoveContainer" containerID="5d0df22f7fd59f68d826d32d34c1cbd872159e007a31d5f544c8ef3bc6f3e281" Feb 17 14:35:33 crc kubenswrapper[4762]: I0217 14:35:33.070919 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:35:33 crc kubenswrapper[4762]: E0217 14:35:33.071749 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Feb 17 14:35:35 crc kubenswrapper[4762]: I0217 14:35:35.557551 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" event={"ID":"c62a5450-285a-4d9d-b5c8-6a4bb248e37d","Type":"ContainerStarted","Data":"0114be74a9a7fafa9144c6bb345a89d6a976631f4269bd151a35887ce990a5c0"} Feb 17 14:35:35 crc kubenswrapper[4762]: I0217 14:35:35.574294 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" podStartSLOduration=1.3169914010000001 podStartE2EDuration="16.574274368s" podCreationTimestamp="2026-02-17 14:35:19 +0000 UTC" firstStartedPulling="2026-02-17 14:35:20.062094752 +0000 UTC m=+1800.642095414" lastFinishedPulling="2026-02-17 14:35:35.319377729 +0000 UTC m=+1815.899378381" observedRunningTime="2026-02-17 14:35:35.571062741 +0000 UTC m=+1816.151063403" watchObservedRunningTime="2026-02-17 14:35:35.574274368 +0000 UTC m=+1816.154275020" Feb 17 14:35:47 crc kubenswrapper[4762]: I0217 14:35:47.072136 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:35:47 crc kubenswrapper[4762]: E0217 14:35:47.072918 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:35:59 crc kubenswrapper[4762]: I0217 14:35:59.237585 4762 generic.go:334] "Generic (PLEG): container finished" podID="c62a5450-285a-4d9d-b5c8-6a4bb248e37d" containerID="0114be74a9a7fafa9144c6bb345a89d6a976631f4269bd151a35887ce990a5c0" exitCode=0 Feb 17 14:35:59 crc kubenswrapper[4762]: I0217 14:35:59.238155 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" event={"ID":"c62a5450-285a-4d9d-b5c8-6a4bb248e37d","Type":"ContainerDied","Data":"0114be74a9a7fafa9144c6bb345a89d6a976631f4269bd151a35887ce990a5c0"} Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.082760 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:36:00 crc kubenswrapper[4762]: E0217 14:36:00.083177 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.430658 4762 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.473760 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-xgnxh"] Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.487976 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-xgnxh"] Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.548598 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zxvq\" (UniqueName: \"kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq\") pod \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.549214 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host\") pod \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\" (UID: \"c62a5450-285a-4d9d-b5c8-6a4bb248e37d\") " Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.549554 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host" (OuterVolumeSpecName: "host") pod "c62a5450-285a-4d9d-b5c8-6a4bb248e37d" (UID: "c62a5450-285a-4d9d-b5c8-6a4bb248e37d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.550216 4762 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.554375 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq" (OuterVolumeSpecName: "kube-api-access-6zxvq") pod "c62a5450-285a-4d9d-b5c8-6a4bb248e37d" (UID: "c62a5450-285a-4d9d-b5c8-6a4bb248e37d"). InnerVolumeSpecName "kube-api-access-6zxvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:36:00 crc kubenswrapper[4762]: I0217 14:36:00.652227 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zxvq\" (UniqueName: \"kubernetes.io/projected/c62a5450-285a-4d9d-b5c8-6a4bb248e37d-kube-api-access-6zxvq\") on node \"crc\" DevicePath \"\"" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.265450 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1dcbba9bb7f26edb0d94ab5eb10bd68c675827abcc677982965b5baaa720312" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.265551 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-xgnxh" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.679701 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-9wxsz"] Feb 17 14:36:01 crc kubenswrapper[4762]: E0217 14:36:01.680236 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62a5450-285a-4d9d-b5c8-6a4bb248e37d" containerName="container-00" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.680255 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62a5450-285a-4d9d-b5c8-6a4bb248e37d" containerName="container-00" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.680511 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62a5450-285a-4d9d-b5c8-6a4bb248e37d" containerName="container-00" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.681327 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.684666 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xsj4g"/"default-dockercfg-4hpbt" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.826587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.827063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v98zm\" (UniqueName: \"kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.929043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.929176 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v98zm\" (UniqueName: \"kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.929215 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:01 crc kubenswrapper[4762]: I0217 14:36:01.946870 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v98zm\" (UniqueName: \"kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm\") pod \"crc-debug-9wxsz\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:02 crc kubenswrapper[4762]: I0217 
14:36:02.004921 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:02 crc kubenswrapper[4762]: I0217 14:36:02.095680 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62a5450-285a-4d9d-b5c8-6a4bb248e37d" path="/var/lib/kubelet/pods/c62a5450-285a-4d9d-b5c8-6a4bb248e37d/volumes" Feb 17 14:36:02 crc kubenswrapper[4762]: I0217 14:36:02.276489 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" event={"ID":"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18","Type":"ContainerStarted","Data":"2124eec7ac4065ad967ae70c2c3d9876bc7e00de389412f57f57fc66139753ff"} Feb 17 14:36:03 crc kubenswrapper[4762]: I0217 14:36:03.290230 4762 generic.go:334] "Generic (PLEG): container finished" podID="7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" containerID="50a7c19ab28f74b0a06f1cebf0acf71291b14e4c9152610044f0a06e19ed8d58" exitCode=1 Feb 17 14:36:03 crc kubenswrapper[4762]: I0217 14:36:03.290292 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" event={"ID":"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18","Type":"ContainerDied","Data":"50a7c19ab28f74b0a06f1cebf0acf71291b14e4c9152610044f0a06e19ed8d58"} Feb 17 14:36:03 crc kubenswrapper[4762]: I0217 14:36:03.332466 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-9wxsz"] Feb 17 14:36:03 crc kubenswrapper[4762]: I0217 14:36:03.348383 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xsj4g/crc-debug-9wxsz"] Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.432106 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.499086 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v98zm\" (UniqueName: \"kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm\") pod \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.499908 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host\") pod \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\" (UID: \"7ae96d33-ebf9-4885-aaf1-dac1acf5eb18\") " Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.501450 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host" (OuterVolumeSpecName: "host") pod "7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" (UID: "7ae96d33-ebf9-4885-aaf1-dac1acf5eb18"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.507290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm" (OuterVolumeSpecName: "kube-api-access-v98zm") pod "7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" (UID: "7ae96d33-ebf9-4885-aaf1-dac1acf5eb18"). InnerVolumeSpecName "kube-api-access-v98zm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.603306 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v98zm\" (UniqueName: \"kubernetes.io/projected/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-kube-api-access-v98zm\") on node \"crc\" DevicePath \"\"" Feb 17 14:36:04 crc kubenswrapper[4762]: I0217 14:36:04.603338 4762 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:36:05 crc kubenswrapper[4762]: I0217 14:36:05.314294 4762 scope.go:117] "RemoveContainer" containerID="50a7c19ab28f74b0a06f1cebf0acf71291b14e4c9152610044f0a06e19ed8d58" Feb 17 14:36:05 crc kubenswrapper[4762]: I0217 14:36:05.314319 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/crc-debug-9wxsz" Feb 17 14:36:06 crc kubenswrapper[4762]: I0217 14:36:06.085782 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" path="/var/lib/kubelet/pods/7ae96d33-ebf9-4885-aaf1-dac1acf5eb18/volumes" Feb 17 14:36:12 crc kubenswrapper[4762]: I0217 14:36:12.071849 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:36:12 crc kubenswrapper[4762]: E0217 14:36:12.072650 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:36:23 crc kubenswrapper[4762]: I0217 14:36:23.071456 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:36:23 crc kubenswrapper[4762]: E0217 14:36:23.072276 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:36:25 crc kubenswrapper[4762]: I0217 14:36:25.501313 4762 scope.go:117] "RemoveContainer" containerID="32a94d62c2e7d2a6766a7870466783bc42e46fbe12f626f85b1a7961462224e0" Feb 17 14:36:25 crc kubenswrapper[4762]: I0217 14:36:25.537793 4762 scope.go:117] "RemoveContainer" containerID="3ca505da16de76261387772b87b6a5926a9c46cd51520a42e4b6302224132fcf" Feb 17 14:36:25 crc kubenswrapper[4762]: I0217 14:36:25.580448 4762 scope.go:117] "RemoveContainer" containerID="40bfadd0be5a49cf632f62cc2d679da6a27b3b7606bb06e8c319ffb998c7a00a" Feb 17 14:36:35 crc kubenswrapper[4762]: I0217 14:36:35.071016 4762 scope.go:117] "RemoveContainer" containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:36:36 crc kubenswrapper[4762]: I0217 14:36:36.127878 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" 
event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f"} Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.193164 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_55524ce8-1fb2-4a0c-ad16-e6ba37940c0a/aodh-api/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.242604 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_55524ce8-1fb2-4a0c-ad16-e6ba37940c0a/aodh-evaluator/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.408810 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_55524ce8-1fb2-4a0c-ad16-e6ba37940c0a/aodh-notifier/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.436742 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_55524ce8-1fb2-4a0c-ad16-e6ba37940c0a/aodh-listener/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.533151 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0adb-account-create-update-v2qxg_5b5722df-f962-403c-abfa-793bc821be57/mariadb-account-create-update/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.658076 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-create-phqhg_3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55/mariadb-database-create/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.758543 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-sync-fgpcm_82cbcf38-171c-4676-988f-a742b4277bb6/aodh-db-sync/0.log" Feb 17 14:37:03 crc kubenswrapper[4762]: I0217 14:37:03.902025 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f7475d794-g4jpc_dafb15f9-f633-4acc-a69f-6199b20ae0e7/barbican-api/0.log" Feb 17 14:37:04 crc kubenswrapper[4762]: I0217 14:37:04.042111 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f7475d794-g4jpc_dafb15f9-f633-4acc-a69f-6199b20ae0e7/barbican-api-log/0.log" Feb 17 14:37:04 crc kubenswrapper[4762]: I0217 14:37:04.141218 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-b315-account-create-update-nnnmm_8ad6e8de-6bb3-4a3e-b664-db44abab1875/mariadb-account-create-update/0.log" Feb 17 14:37:04 crc kubenswrapper[4762]: I0217 14:37:04.294609 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-7wqqm_3b691b6d-c42b-491d-a1d0-3c5cb236598b/mariadb-database-create/0.log" Feb 17 14:37:04 crc kubenswrapper[4762]: I0217 14:37:04.371434 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-smktq_a9c276b7-cca9-42c7-8605-5f2bfa0da0e1/barbican-db-sync/0.log" Feb 17 14:37:04 crc kubenswrapper[4762]: I0217 14:37:04.547171 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-665f7bf56b-7d7wz_f6a51610-1744-455d-beff-2204a3452e61/barbican-keystone-listener/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.068412 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-665f7bf56b-7d7wz_f6a51610-1744-455d-beff-2204a3452e61/barbican-keystone-listener-log/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.108346 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-67d8dd69f-j2ffh_a887bb10-111b-4b5e-b2fc-c204129ff11c/barbican-worker/0.log" Feb 17 14:37:05 
crc kubenswrapper[4762]: I0217 14:37:05.188422 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-67d8dd69f-j2ffh_a887bb10-111b-4b5e-b2fc-c204129ff11c/barbican-worker-log/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.336609 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1b6239e-147b-429a-8765-dce18c23d63b/ceilometer-notification-agent/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.360477 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1b6239e-147b-429a-8765-dce18c23d63b/ceilometer-central-agent/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.419030 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1b6239e-147b-429a-8765-dce18c23d63b/proxy-httpd/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.521033 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f1b6239e-147b-429a-8765-dce18c23d63b/sg-core/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.559553 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-a355-account-create-update-wzz5t_ee986585-bdb5-4bed-8002-7cf0a80784a8/mariadb-account-create-update/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.726185 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1e58addf-d172-4f09-b4e5-30b62cafb801/cinder-api/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.754247 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1e58addf-d172-4f09-b4e5-30b62cafb801/cinder-api-log/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.878009 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-lrcjs_93fb932d-6901-44d9-a508-a32692308154/mariadb-database-create/0.log" Feb 17 14:37:05 crc kubenswrapper[4762]: I0217 14:37:05.954667 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-sync-95lkq_d6ea0210-709e-4a47-87d1-48c811c0ab85/cinder-db-sync/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.141580 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_048d8d34-8b8e-4267-9747-2db21026d3a8/cinder-scheduler/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.177382 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_048d8d34-8b8e-4267-9747-2db21026d3a8/probe/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.324850 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f84f9ccf-z9jpf_7ee8353e-dc34-46ac-ace9-d0de5574c65b/init/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.491311 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f84f9ccf-z9jpf_7ee8353e-dc34-46ac-ace9-d0de5574c65b/init/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.554554 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-4bb1-account-create-update-vtj6t_9c65095d-efc4-4480-b244-55169974d63d/mariadb-account-create-update/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.559283 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f84f9ccf-z9jpf_7ee8353e-dc34-46ac-ace9-d0de5574c65b/dnsmasq-dns/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 
14:37:06.735858 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-5qq4s_d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab/mariadb-database-create/0.log" Feb 17 14:37:06 crc kubenswrapper[4762]: I0217 14:37:06.797144 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-tt6cp_ddad90d3-b6d4-4a8c-82cd-883fcc0e0574/glance-db-sync/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.006536 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d64001d1-6972-4563-a764-05b359233d62/glance-httpd/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.042200 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d64001d1-6972-4563-a764-05b359233d62/glance-log/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.191811 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c92f5203-d922-420b-9537-34cb7656e78c/glance-httpd/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.206778 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c92f5203-d922-420b-9537-34cb7656e78c/glance-log/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.372362 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-8332-account-create-update-8vvzv_43ed625c-d879-4409-9450-d61b3f7cc686/mariadb-account-create-update/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.486235 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6885f6c5bd-nskzc_58b7d970-aa37-44b3-b64b-a55bcf38f7cb/heat-api/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.647750 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-579766b5b-pgs2q_d0e19e34-aa03-40bc-8f4b-3604a80d6683/heat-cfnapi/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.678347 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-z944d_d8300c70-e571-49c5-a403-d645237d7012/mariadb-database-create/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.847799 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-h7qp8_8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3/heat-db-sync/0.log" Feb 17 14:37:07 crc kubenswrapper[4762]: I0217 14:37:07.901018 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-68c7cc4b78-lr6mt_d19729e1-9b79-4762-821b-10ccba91c176/heat-engine/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.036988 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-400c-account-create-update-88mqh_8c69c000-54f6-4b64-a7fa-454fd519aad5/mariadb-account-create-update/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.217563 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86657f9797-7sk9h_a23de52d-c70a-4f76-b067-cf4fef32b584/keystone-api/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.250582 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-5mknf_53984f9c-be03-44a6-91da-65972a4b4cd5/keystone-bootstrap/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.453761 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-db-create-zblds_808ae239-be89-433d-ab1f-8807e658af8d/mariadb-database-create/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.482609 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-q6l4w_8acf7e9f-6215-417b-b385-68b30decf4c8/keystone-db-sync/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.505027 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_398ab3b8-a4d9-48fd-9236-9e7aed43e7d9/kube-state-metrics/0.log" Feb 17 14:37:08 crc kubenswrapper[4762]: I0217 14:37:08.832135 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_ceb5a67d-f2f8-4d60-b90e-8cc0c3599146/mysqld-exporter/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.177994 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-1559-account-create-update-562bx_60202600-f7cc-4623-abf8-d3f1ad5662aa/mariadb-account-create-update/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.267929 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-32e0-account-create-update-fr87w_7e0fb0bc-3e83-444f-8c0d-701c9e0ed873/mariadb-account-create-update/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.433618 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-openstack-cell1-db-create-4q4bb_0270bd57-0aa6-48bf-98ed-d37d70fbb42c/mariadb-database-create/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.575471 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-openstack-db-create-5mzzr_11daea56-42b9-45b6-980a-c6afbe877c80/mariadb-database-create/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.841916 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-558c556c77-d2tbn_af765db9-bd7e-4747-8269-49a27c5f0dc6/neutron-api/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.843635 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-558c556c77-d2tbn_af765db9-bd7e-4747-8269-49a27c5f0dc6/neutron-httpd/0.log" Feb 17 14:37:09 crc kubenswrapper[4762]: I0217 14:37:09.994099 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-be62-account-create-update-sl2zr_cb3e6eca-01ec-4a72-b83c-80183169dbf1/mariadb-account-create-update/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.118790 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-tvd94_7220a0cb-7e9b-4648-ae3c-3289c1aa3493/mariadb-database-create/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.326107 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-wtc2k_cc27563b-a5bb-4e82-a286-e0628e7c07b3/neutron-db-sync/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.503872 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae89a58d-cd03-4c0c-8d74-a683f1d77bf3/nova-api-api/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.578368 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ae89a58d-cd03-4c0c-8d74-a683f1d77bf3/nova-api-log/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.720024 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0142-account-create-update-9mv69_277ee237-c640-42ab-8439-d23e72f087e1/mariadb-account-create-update/0.log" Feb 17 14:37:10 crc kubenswrapper[4762]: I0217 14:37:10.829262 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-jljhd_bb8711f3-a902-4c23-8c91-3e8819cc74ca/mariadb-database-create/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.015350 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-8886-account-create-update-w9f55_8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8/mariadb-account-create-update/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.065851 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-wdbb8_a4589d86-754e-46ec-bd8f-412abdf21890/nova-manage/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.295891 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-7x82n_92bb66fd-cea7-435b-8915-0641110c25af/nova-cell0-conductor-db-sync/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.382615 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_889ee23b-4c8c-4cc6-a28a-9ed791cbe9b0/nova-cell0-conductor-conductor/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.553169 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-nnss4_da99eccd-0482-4e64-bb27-6b87437ae8ba/mariadb-database-create/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.696096 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-9c9e-account-create-update-2865f_d5fb9f5e-d096-4b3d-82cb-881bcc844cab/mariadb-account-create-update/0.log" Feb 17 14:37:11 crc kubenswrapper[4762]: I0217 14:37:11.862438 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-hmbsl_c15862fc-7a11-484e-8343-c565ddcc60eb/nova-manage/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.035867 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c779d9da-d7c8-4829-b255-a1f4749f0fbe/nova-cell1-conductor-conductor/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.153327 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-9zsnn_5ae10efe-5821-4182-8f8b-bd9c6cc13a4d/nova-cell1-conductor-db-sync/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.266916 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-kz5nv_b6bb5440-4045-43cc-acbd-a61bc6b8efa7/mariadb-database-create/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.434046 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a388c0a6-5d6a-4d70-8527-40ae2f62eca4/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.657370 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_338b2e6a-3e06-422f-8e9b-917735470caa/nova-metadata-log/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.879096 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_338b2e6a-3e06-422f-8e9b-917735470caa/nova-metadata-metadata/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.895536 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_bbd5850c-1106-4dd4-a7d7-b13e08eff2f5/mysql-bootstrap/0.log" Feb 17 14:37:12 crc kubenswrapper[4762]: I0217 14:37:12.905522 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_0a0b2598-a78d-461c-bd60-6eca94aed9d9/nova-scheduler-scheduler/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.198740 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bbd5850c-1106-4dd4-a7d7-b13e08eff2f5/mysql-bootstrap/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.233801 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bbd5850c-1106-4dd4-a7d7-b13e08eff2f5/galera/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.275492 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fe6d960-8cae-47d2-86e7-c077f0facaae/mysql-bootstrap/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.494951 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fe6d960-8cae-47d2-86e7-c077f0facaae/galera/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.522132 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c9dd2323-04a9-409b-b035-7d086e4eaef6/openstackclient/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.522741 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fe6d960-8cae-47d2-86e7-c077f0facaae/mysql-bootstrap/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.685546 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7s7b5_3c6069ca-94f7-439c-9434-0d79b4e56500/openstack-network-exporter/0.log" Feb 17 14:37:13 crc kubenswrapper[4762]: I0217 14:37:13.929810 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7gshj_549db29e-a842-49dc-8b6b-1fe3f83857da/ovsdb-server-init/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.112681 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7gshj_549db29e-a842-49dc-8b6b-1fe3f83857da/ovs-vswitchd/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.131828 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7gshj_549db29e-a842-49dc-8b6b-1fe3f83857da/ovsdb-server/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.166517 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7gshj_549db29e-a842-49dc-8b6b-1fe3f83857da/ovsdb-server-init/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.307085 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xspft_0611dcb7-08c7-4999-8bc2-210224f89e66/ovn-controller/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.415094 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_35249c1a-ea4f-419c-91be-dfee3dbf3303/ovn-northd/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.459429 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_35249c1a-ea4f-419c-91be-dfee3dbf3303/openstack-network-exporter/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.640364 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_de4ebcd7-ede5-4a4a-aed5-55d31eee13bf/openstack-network-exporter/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.672838 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_de4ebcd7-ede5-4a4a-aed5-55d31eee13bf/ovsdbserver-nb/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.901127 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b848d44f-ad87-4491-a0af-c2028ee1827b/ovsdbserver-sb/0.log" Feb 17 14:37:14 crc kubenswrapper[4762]: I0217 14:37:14.928611 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b848d44f-ad87-4491-a0af-c2028ee1827b/openstack-network-exporter/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.021173 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-74c5954b4-v4d8z_c64547d6-018c-4123-9017-3f5ef64949b2/placement-api/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.115165 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-74c5954b4-v4d8z_c64547d6-018c-4123-9017-3f5ef64949b2/placement-log/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.221169 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-a199-account-create-update-hxcrn_46085b5b-97db-43a2-9a40-b6fc4c6d4f60/mariadb-account-create-update/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.380185 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-njdl7_3cb9fb92-bfd5-48fc-8d6f-1b616a958e25/mariadb-database-create/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.483093 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-lq7n6_8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64/placement-db-sync/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.660947 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bad07381-6a78-4418-b451-0521ee7d95f9/init-config-reloader/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.894972 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bad07381-6a78-4418-b451-0521ee7d95f9/init-config-reloader/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.911301 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bad07381-6a78-4418-b451-0521ee7d95f9/config-reloader/0.log" Feb 17 14:37:15 crc kubenswrapper[4762]: I0217 14:37:15.968685 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bad07381-6a78-4418-b451-0521ee7d95f9/thanos-sidecar/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.025831 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bad07381-6a78-4418-b451-0521ee7d95f9/prometheus/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.143248 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c34ffbd-b33d-4579-8a4d-a51ef852b1a1/setup-container/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.396040 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c34ffbd-b33d-4579-8a4d-a51ef852b1a1/setup-container/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.420956 
4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c34ffbd-b33d-4579-8a4d-a51ef852b1a1/rabbitmq/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.488845 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_12862d08-7816-4a6d-9a52-aceeae5e1d8e/setup-container/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.682398 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_12862d08-7816-4a6d-9a52-aceeae5e1d8e/setup-container/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.742533 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_12862d08-7816-4a6d-9a52-aceeae5e1d8e/rabbitmq/0.log" Feb 17 14:37:16 crc kubenswrapper[4762]: I0217 14:37:16.817697 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d23bccd7-14f7-419d-95db-38470afb02b0/setup-container/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.074497 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d23bccd7-14f7-419d-95db-38470afb02b0/setup-container/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.075347 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_391886d8-341f-4e66-980c-00f6cd881e10/setup-container/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.111834 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d23bccd7-14f7-419d-95db-38470afb02b0/rabbitmq/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.316597 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_391886d8-341f-4e66-980c-00f6cd881e10/rabbitmq/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.320274 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_391886d8-341f-4e66-980c-00f6cd881e10/setup-container/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.438679 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-lq7w9_16658e34-885b-4693-9784-bd985a6acd52/mariadb-account-create-update/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.666492 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bfd9c8d59-mxmfg_849ff889-c3dd-4ae3-b103-b49b6ad2535d/proxy-httpd/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.693838 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bfd9c8d59-mxmfg_849ff889-c3dd-4ae3-b103-b49b6ad2535d/proxy-server/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.949212 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/account-auditor/0.log" Feb 17 14:37:17 crc kubenswrapper[4762]: I0217 14:37:17.950836 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-674vl_f6083b27-9cd4-494a-8b51-9dff95918001/swift-ring-rebalance/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.026518 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/account-reaper/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.173154 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/container-auditor/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.174342 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/account-server/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.206033 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/account-replicator/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.304258 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/container-replicator/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.414720 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/container-server/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.447168 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/container-updater/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.449207 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/object-auditor/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.558371 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/object-expirer/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.638494 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/object-server/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.658634 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/object-replicator/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.731217 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/object-updater/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.822369 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/rsync/0.log" Feb 17 14:37:18 crc kubenswrapper[4762]: I0217 14:37:18.874292 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_466a7dc3-63d2-4995-ab6f-712df183303d/swift-recon-cron/0.log" Feb 17 14:37:19 crc kubenswrapper[4762]: I0217 14:37:19.973143 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b6a0797a-2f28-4b9e-ba3d-7151ab86bd4c/memcached/0.log" Feb 17 14:37:49 crc kubenswrapper[4762]: I0217 14:37:49.668410 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/util/0.log" Feb 17 14:37:50 crc kubenswrapper[4762]: I0217 14:37:50.437461 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/util/0.log" Feb 17 14:37:50 crc kubenswrapper[4762]: I0217 14:37:50.489422 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/pull/0.log" Feb 17 14:37:50 crc kubenswrapper[4762]: I0217 14:37:50.550466 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/pull/0.log" Feb 17 14:37:50 crc kubenswrapper[4762]: I0217 14:37:50.989832 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/pull/0.log" Feb 17 14:37:51 crc kubenswrapper[4762]: I0217 14:37:51.006341 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/extract/0.log" Feb 17 14:37:51 crc kubenswrapper[4762]: I0217 14:37:51.030123 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae61tmqwh_0f03ab51-9f15-43df-b897-d62a6e067994/util/0.log" Feb 17 14:37:51 crc kubenswrapper[4762]: I0217 14:37:51.727324 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ftcx6_bfc8279b-f4c4-4e89-8663-1b4ba1c25ba1/manager/0.log" Feb 17 14:37:52 crc kubenswrapper[4762]: I0217 14:37:52.163991 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-spgjw_6b5af5f5-ea83-427b-b987-f6215d329670/manager/0.log" Feb 17 14:37:52 crc kubenswrapper[4762]: I0217 14:37:52.727498 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ww45l_f2be497a-b70f-49ca-880e-9675bfd83a93/manager/0.log" Feb 17 14:37:52 crc kubenswrapper[4762]: I0217 14:37:52.844677 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-6mbwp_09b86f06-6cae-45aa-8e1e-8de6408dae32/manager/0.log" Feb 17 14:37:53 crc kubenswrapper[4762]: I0217 14:37:53.713118 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-2k62f_2ebeafd3-8c4c-4473-b382-7f190a92096a/manager/0.log" Feb 17 14:37:53 crc kubenswrapper[4762]: I0217 14:37:53.731499 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-x847n_6a22270e-2c9e-48d2-8554-8885a67fa92d/manager/0.log" Feb 17 14:37:53 crc kubenswrapper[4762]: I0217 14:37:53.760203 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-rnh4n_004074b2-55cb-4596-84e6-b715ec66bd2c/manager/0.log" Feb 17 14:37:55 crc kubenswrapper[4762]: I0217 14:37:55.012210 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-gtjx5_9c5eb531-17f0-4eae-a0a6-f44f2ca0da97/manager/0.log" Feb 17 14:37:55 crc kubenswrapper[4762]: I0217 14:37:55.024716 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-kt8qn_0178fd98-dd5b-43f5-b2cd-d118b3803888/manager/0.log" Feb 17 14:37:55 crc kubenswrapper[4762]: I0217 14:37:55.707465 4762 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-wwhs6_0cf7a5f5-8168-4054-8aba-55315da55d18/manager/0.log" Feb 17 14:37:55 crc kubenswrapper[4762]: I0217 14:37:55.732884 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-74hcc_0c922b97-d376-45cc-986d-c13735e6c43e/manager/0.log" Feb 17 14:37:56 crc kubenswrapper[4762]: I0217 14:37:56.265197 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-jh42l_b570b810-b8a4-4ca0-89d5-3992368a4867/manager/0.log" Feb 17 14:37:56 crc kubenswrapper[4762]: I0217 14:37:56.533780 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9czq5nr_6abe751d-7643-4aa7-a843-bbde4ed4a457/manager/0.log" Feb 17 14:37:57 crc kubenswrapper[4762]: I0217 14:37:57.228962 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7464dc569f-ggt7c_517df0cc-d4c5-41f7-aa3d-53b2830f427c/operator/0.log" Feb 17 14:37:57 crc kubenswrapper[4762]: I0217 14:37:57.450914 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sh6w6_f96d5046-7e85-41d7-b333-a5d22ef1e541/registry-server/0.log" Feb 17 14:37:58 crc kubenswrapper[4762]: I0217 14:37:58.298265 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-qbgn5_2d3c8e1f-e388-467a-a744-5c332868bde3/manager/0.log" Feb 17 14:37:58 crc kubenswrapper[4762]: I0217 14:37:58.928625 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-jtvhg_4414da08-4cca-4b53-b590-3511e77060e0/manager/0.log" Feb 17 14:37:59 crc kubenswrapper[4762]: I0217 14:37:59.360864 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6pl9x_4d1822b6-73cd-4b72-9c6e-415b9cfb0e4d/operator/0.log" Feb 17 14:37:59 crc kubenswrapper[4762]: I0217 14:37:59.613045 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-667f54696f-gddhj_2dd899d8-8882-45e1-952a-e4103384ac4c/manager/0.log" Feb 17 14:37:59 crc kubenswrapper[4762]: I0217 14:37:59.617638 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-xg6kw_149d4551-5870-46cb-871b-8a0e5dd25508/manager/0.log" Feb 17 14:37:59 crc kubenswrapper[4762]: I0217 14:37:59.628317 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-jkgwj_afb78ebd-d200-4441-a12f-e1e63dfb71d9/manager/0.log" Feb 17 14:37:59 crc kubenswrapper[4762]: I0217 14:37:59.856460 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2hv4z_f1d7b36c-7d66-4e34-a412-fbbf64b6e9eb/manager/0.log" Feb 17 14:38:00 crc kubenswrapper[4762]: I0217 14:38:00.225370 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-bzgvz_a7230b0a-9b7e-4430-843d-7754ba5dc370/manager/0.log" Feb 17 14:38:00 crc kubenswrapper[4762]: I0217 14:38:00.265061 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d6964fcdb-5jb4z_ee6bd164-eb6d-462f-96c1-39bdf3ea7b1e/manager/0.log" Feb 17 14:38:02 crc kubenswrapper[4762]: I0217 14:38:02.277136 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-4bg4h_6b0c5012-70b1-42f3-9bf1-734acf6a8f2f/manager/0.log" Feb 17 14:38:23 crc kubenswrapper[4762]: I0217 14:38:23.572298 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-g7x76_47a2ded9-7d7e-48b5-b45c-d4adcebc60c1/control-plane-machine-set-operator/0.log" Feb 17 14:38:23 crc kubenswrapper[4762]: I0217 14:38:23.773836 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wpkmz_3b826bc6-e50e-4b2c-8737-254c6d743ad8/kube-rbac-proxy/0.log" Feb 17 14:38:23 crc kubenswrapper[4762]: I0217 14:38:23.774555 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wpkmz_3b826bc6-e50e-4b2c-8737-254c6d743ad8/machine-api-operator/0.log" Feb 17 14:38:25 crc kubenswrapper[4762]: I0217 14:38:25.770446 4762 scope.go:117] "RemoveContainer" containerID="2c899ca16dbffc9ffd16c176d1a5962956dfca67f29dc0f5ed988a1d66008235" Feb 17 14:38:25 crc kubenswrapper[4762]: I0217 14:38:25.804463 4762 scope.go:117] "RemoveContainer" containerID="33b44dc7093f08ac9b8db042dc7d3a5ae8459428ed86fa37213473b5159d80d0" Feb 17 14:38:25 crc kubenswrapper[4762]: I0217 14:38:25.842138 4762 scope.go:117] "RemoveContainer" containerID="d5637ab010ca30227d0f7953c7c27e73d747e7dceb945206c765e4da83221f3c" Feb 17 14:38:34 crc kubenswrapper[4762]: I0217 14:38:34.058468 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-1559-account-create-update-562bx"] Feb 17 14:38:34 crc kubenswrapper[4762]: I0217 14:38:34.108122 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-1559-account-create-update-562bx"] Feb 17 14:38:36 crc kubenswrapper[4762]: I0217 14:38:36.090027 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60202600-f7cc-4623-abf8-d3f1ad5662aa" path="/var/lib/kubelet/pods/60202600-f7cc-4623-abf8-d3f1ad5662aa/volumes" Feb 17 14:38:36 crc kubenswrapper[4762]: I0217 14:38:36.406714 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-5fk9z_24448600-d00c-44b6-a1d9-08ce0d5cd43c/cert-manager-controller/0.log" Feb 17 14:38:36 crc kubenswrapper[4762]: I0217 14:38:36.518554 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-27rxl_2dd817de-0e2d-40fe-ba7d-036a6e1247dd/cert-manager-cainjector/0.log" Feb 17 14:38:36 crc kubenswrapper[4762]: I0217 14:38:36.620324 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dpg84_9233ba97-592c-4c1d-9326-c726d6d43f12/cert-manager-webhook/0.log" Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.050430 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5mzzr"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.061442 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-njdl7"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.074342 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-4bb1-account-create-update-vtj6t"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.087987 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-5mzzr"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.106820 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zblds"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.127931 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a199-account-create-update-hxcrn"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.140907 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4bb1-account-create-update-vtj6t"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.152454 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-njdl7"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.162918 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zblds"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.176926 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a199-account-create-update-hxcrn"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.186705 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-400c-account-create-update-88mqh"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.198521 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-400c-account-create-update-88mqh"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.209358 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5qq4s"] Feb 17 14:38:39 crc kubenswrapper[4762]: I0217 14:38:39.220992 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5qq4s"] Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.090390 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11daea56-42b9-45b6-980a-c6afbe877c80" path="/var/lib/kubelet/pods/11daea56-42b9-45b6-980a-c6afbe877c80/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.093221 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb9fb92-bfd5-48fc-8d6f-1b616a958e25" path="/var/lib/kubelet/pods/3cb9fb92-bfd5-48fc-8d6f-1b616a958e25/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.094978 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46085b5b-97db-43a2-9a40-b6fc4c6d4f60" path="/var/lib/kubelet/pods/46085b5b-97db-43a2-9a40-b6fc4c6d4f60/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.096401 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808ae239-be89-433d-ab1f-8807e658af8d" path="/var/lib/kubelet/pods/808ae239-be89-433d-ab1f-8807e658af8d/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.098556 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c69c000-54f6-4b64-a7fa-454fd519aad5" path="/var/lib/kubelet/pods/8c69c000-54f6-4b64-a7fa-454fd519aad5/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.100572 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c65095d-efc4-4480-b244-55169974d63d" path="/var/lib/kubelet/pods/9c65095d-efc4-4480-b244-55169974d63d/volumes" Feb 17 14:38:40 crc kubenswrapper[4762]: I0217 14:38:40.101946 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab" path="/var/lib/kubelet/pods/d0f5362f-c5e9-4e05-8a7d-6071fa53c4ab/volumes" Feb 17 14:38:46 crc kubenswrapper[4762]: I0217 14:38:46.033978 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb"] Feb 17 14:38:46 crc kubenswrapper[4762]: I0217 14:38:46.046489 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4q4bb"] Feb 17 14:38:46 crc kubenswrapper[4762]: I0217 14:38:46.088118 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0270bd57-0aa6-48bf-98ed-d37d70fbb42c" path="/var/lib/kubelet/pods/0270bd57-0aa6-48bf-98ed-d37d70fbb42c/volumes" Feb 17 14:38:47 crc kubenswrapper[4762]: I0217 14:38:47.043405 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-32e0-account-create-update-fr87w"] Feb 17 14:38:47 crc kubenswrapper[4762]: I0217 14:38:47.063361 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-32e0-account-create-update-fr87w"] Feb 17 14:38:48 crc kubenswrapper[4762]: I0217 14:38:48.099174 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0fb0bc-3e83-444f-8c0d-701c9e0ed873" path="/var/lib/kubelet/pods/7e0fb0bc-3e83-444f-8c0d-701c9e0ed873/volumes" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.137367 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-mwkcm_676a0670-76e5-4a67-8afc-9e69c1561f26/nmstate-console-plugin/0.log" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.339566 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-chbj9_384f1796-2d88-476c-be59-1abc8ee06efb/nmstate-handler/0.log" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.474057 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-pg2bv_d8c030bf-f09b-4f2d-9db7-b167348f912f/kube-rbac-proxy/0.log" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.483716 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-pg2bv_d8c030bf-f09b-4f2d-9db7-b167348f912f/nmstate-metrics/0.log" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.611896 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-ctz7n_7b234a38-b4bf-43c7-b406-127d6df3b021/nmstate-operator/0.log" Feb 17 14:38:51 crc kubenswrapper[4762]: I0217 14:38:51.714812 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-tlsn7_1a3455d0-6909-41ab-9c83-f5a96c9858d1/nmstate-webhook/0.log" Feb 17 14:38:54 crc kubenswrapper[4762]: I0217 14:38:54.621413 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:38:54 crc kubenswrapper[4762]: I0217 14:38:54.622121 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:39:05 crc 
kubenswrapper[4762]: I0217 14:39:05.882203 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59cfb98864-gc6tj_425e262b-13e9-474a-85f5-1a0501569aa9/kube-rbac-proxy/0.log" Feb 17 14:39:06 crc kubenswrapper[4762]: I0217 14:39:06.034448 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59cfb98864-gc6tj_425e262b-13e9-474a-85f5-1a0501569aa9/manager/0.log" Feb 17 14:39:15 crc kubenswrapper[4762]: I0217 14:39:15.047478 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lrcjs"] Feb 17 14:39:15 crc kubenswrapper[4762]: I0217 14:39:15.061513 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lrcjs"] Feb 17 14:39:16 crc kubenswrapper[4762]: I0217 14:39:16.085139 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fb932d-6901-44d9-a508-a32692308154" path="/var/lib/kubelet/pods/93fb932d-6901-44d9-a508-a32692308154/volumes" Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.184980 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-be62-account-create-update-sl2zr"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.196588 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-8332-account-create-update-8vvzv"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.210760 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-be62-account-create-update-sl2zr"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.224359 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7wqqm"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.240459 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tvd94"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.251468 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-z944d"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.262546 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tvd94"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.274371 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-8332-account-create-update-8vvzv"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.286100 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7wqqm"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.299163 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-z944d"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.310378 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a355-account-create-update-wzz5t"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.321702 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b315-account-create-update-nnnmm"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.333212 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a355-account-create-update-wzz5t"] Feb 17 14:39:21 crc kubenswrapper[4762]: I0217 14:39:21.345636 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b315-account-create-update-nnnmm"] Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.014593 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-csbmw_d135e9df-e707-48e4-a0ad-0d400cb5b0c8/prometheus-operator/0.log" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.093204 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b691b6d-c42b-491d-a1d0-3c5cb236598b" path="/var/lib/kubelet/pods/3b691b6d-c42b-491d-a1d0-3c5cb236598b/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.095236 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ed625c-d879-4409-9450-d61b3f7cc686" path="/var/lib/kubelet/pods/43ed625c-d879-4409-9450-d61b3f7cc686/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.097789 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7220a0cb-7e9b-4648-ae3c-3289c1aa3493" path="/var/lib/kubelet/pods/7220a0cb-7e9b-4648-ae3c-3289c1aa3493/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.101001 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad6e8de-6bb3-4a3e-b664-db44abab1875" path="/var/lib/kubelet/pods/8ad6e8de-6bb3-4a3e-b664-db44abab1875/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.105273 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3e6eca-01ec-4a72-b83c-80183169dbf1" path="/var/lib/kubelet/pods/cb3e6eca-01ec-4a72-b83c-80183169dbf1/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.106842 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8300c70-e571-49c5-a403-d645237d7012" path="/var/lib/kubelet/pods/d8300c70-e571-49c5-a403-d645237d7012/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.107810 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee986585-bdb5-4bed-8002-7cf0a80784a8" path="/var/lib/kubelet/pods/ee986585-bdb5-4bed-8002-7cf0a80784a8/volumes" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.201572 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_d126b4fc-9d8e-4886-8f76-53268a51258b/prometheus-operator-admission-webhook/0.log" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.330439 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_77607659-a202-47d9-8358-aa339e9ce99d/prometheus-operator-admission-webhook/0.log" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.488588 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fb6t4_5d34e0ae-c3d1-4d05-8a59-ca531de00d98/operator/0.log" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.543252 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-656mp_0e153059-08c6-4155-af14-f724a156b6fd/observability-ui-dashboards/0.log" Feb 17 14:39:22 crc kubenswrapper[4762]: I0217 14:39:22.699565 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-788lp_9decd9a9-2c51-42dc-8fed-78efbe4c828e/perses-operator/0.log" Feb 17 14:39:24 crc kubenswrapper[4762]: I0217 14:39:24.621944 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 17 14:39:24 crc kubenswrapper[4762]: I0217 14:39:24.622500 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:39:25 crc kubenswrapper[4762]: I0217 14:39:25.989899 4762 scope.go:117] "RemoveContainer" containerID="3dcc57905933c53b081cbe5b6724219a68df8eca2edf14101a8004213f41dd23" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.036241 4762 scope.go:117] "RemoveContainer" containerID="a1440e9dafbe555aae2a489afab3b11a1e4730a420a470ef5f9c6ab1f6712e72" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.096346 4762 scope.go:117] "RemoveContainer" containerID="e78f423ef5b9833e47c7d8dc53eaeeb83fee497be745e0ddaccd591008b6d099" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.166749 4762 scope.go:117] "RemoveContainer" containerID="d71554e5eab2f9324767fa0ce932a2d26c3a6a4bd329fc5dd75e3dde4406cefa" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.224105 4762 scope.go:117] "RemoveContainer" containerID="7b78434d42294952137d4e9b42996fd1d92e1096fa03ab5d7c829ec188c416fa" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.430112 4762 scope.go:117] "RemoveContainer" containerID="9f1ce5996958f9dc7ad6f6950a8991ff22e19800bb34ab246870e6e484d2caab" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.478660 4762 scope.go:117] "RemoveContainer" containerID="228fb8a43a6cd143d797a569a730b494dc088b00a3f6bd259e1c0e21a9f7450b" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.503904 4762 scope.go:117] "RemoveContainer" containerID="6b585fc1d7e508864bf3c545229786358225e1d6cca453ad147dcb0c79b40189" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.523833 4762 scope.go:117] "RemoveContainer" containerID="33019fb54e609722ced569220097be6a3a2c7d1b6c067eae11eb22ac2b1cb78e" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.553787 4762 scope.go:117] "RemoveContainer" containerID="b89fd92eb8a368b84e6a672c76e39069e38c02895857ae1e77aa283881d886ed" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.573398 4762 scope.go:117] "RemoveContainer" containerID="1d12a4cd06030465a4e1570620e4ca6e43f5d9d69b19757e8a38e91a258121ec" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.603241 4762 scope.go:117] "RemoveContainer" containerID="01cf411bdaa952701750a9df2a25a47608282543566e90ccf00178957239f1ce" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.632793 4762 scope.go:117] "RemoveContainer" containerID="785cbb491cbe5df25dbc9964a71629fcc710851a6d6098ddbc88a1fd90c4a699" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.655137 4762 scope.go:117] "RemoveContainer" containerID="799f0be8de6774ac888492558e975cbeba5b8650dabba95c8964353f2b8866b6" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.677065 4762 scope.go:117] "RemoveContainer" containerID="6705dec66fd79dde4dbcc153b9f177713ac34f9c71bcb883d6b9433d01f8d9be" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.703293 4762 scope.go:117] "RemoveContainer" containerID="f76f0a45f4c784522da9919e5d767233cb61dece1943b8b5e5308eda5839e74e" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.727317 4762 scope.go:117] "RemoveContainer" containerID="aa92c3b100e57f65921e0e3059e1b58d730bba3b1aa114fbd82fb24afede67a2" Feb 17 14:39:26 crc kubenswrapper[4762]: I0217 14:39:26.748320 4762 scope.go:117] "RemoveContainer" 
containerID="bbd66e54a094fa112a253b7ef7051fb419564765ab4f001b118d257c18b4e927" Feb 17 14:39:28 crc kubenswrapper[4762]: I0217 14:39:28.053702 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tt6cp"] Feb 17 14:39:28 crc kubenswrapper[4762]: I0217 14:39:28.087607 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-q6l4w"] Feb 17 14:39:28 crc kubenswrapper[4762]: I0217 14:39:28.087944 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-q6l4w"] Feb 17 14:39:28 crc kubenswrapper[4762]: I0217 14:39:28.098580 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tt6cp"] Feb 17 14:39:30 crc kubenswrapper[4762]: I0217 14:39:30.089053 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8acf7e9f-6215-417b-b385-68b30decf4c8" path="/var/lib/kubelet/pods/8acf7e9f-6215-417b-b385-68b30decf4c8/volumes" Feb 17 14:39:30 crc kubenswrapper[4762]: I0217 14:39:30.090466 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddad90d3-b6d4-4a8c-82cd-883fcc0e0574" path="/var/lib/kubelet/pods/ddad90d3-b6d4-4a8c-82cd-883fcc0e0574/volumes" Feb 17 14:39:43 crc kubenswrapper[4762]: I0217 14:39:43.036502 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lq7w9"] Feb 17 14:39:43 crc kubenswrapper[4762]: I0217 14:39:43.051777 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lq7w9"] Feb 17 14:39:43 crc kubenswrapper[4762]: I0217 14:39:43.655945 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-c769fd969-m424n_4207d6ad-eef4-44d0-9eb5-814f9ec323ad/cluster-logging-operator/0.log" Feb 17 14:39:43 crc kubenswrapper[4762]: I0217 14:39:43.873163 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-4jmff_a515723d-c024-422f-ae28-6e5b5daeea76/collector/0.log" Feb 17 14:39:43 crc kubenswrapper[4762]: I0217 14:39:43.948796 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_42d848f4-d4aa-4ed4-a7e9-afd29cdc2c8c/loki-compactor/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.341390 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16658e34-885b-4693-9784-bd985a6acd52" path="/var/lib/kubelet/pods/16658e34-885b-4693-9784-bd985a6acd52/volumes" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.380111 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5d5548c9f5-4kq9t_c3a5bdf4-0c8a-4dd6-bfdc-d5167fb1a6e1/loki-distributor/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.425691 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-78d96f4c68-9bhm5_a4bee09c-f081-4ca0-aef8-40effbd263dd/gateway/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.511924 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-78d96f4c68-9bhm5_a4bee09c-f081-4ca0-aef8-40effbd263dd/opa/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.617464 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-78d96f4c68-sf9z2_8a1683ec-0421-4086-8422-8a638b768879/gateway/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.667037 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-78d96f4c68-sf9z2_8a1683ec-0421-4086-8422-8a638b768879/opa/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.808961 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_c6d7c750-d784-4839-b9a6-8dc6348e3a7c/loki-index-gateway/0.log" Feb 17 14:39:44 crc kubenswrapper[4762]: I0217 14:39:44.922125 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_f7a72999-d771-4b3e-ba91-38078274aa35/loki-ingester/0.log" Feb 17 14:39:45 crc kubenswrapper[4762]: I0217 14:39:45.024663 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76bf7b6d45-rfqd7_5fed95ad-ee31-4f63-a4ef-4eaf471c49ee/loki-querier/0.log" Feb 17 14:39:45 crc kubenswrapper[4762]: I0217 14:39:45.769636 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6d6859c548-lm9mq_6b87d089-b22d-483e-88c7-4d4c2e13c566/loki-query-frontend/0.log" Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.622097 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.622681 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.622751 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.623793 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.623864 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f" gracePeriod=600 Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.776219 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f" exitCode=0 Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.776274 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f"} Feb 17 14:39:54 crc kubenswrapper[4762]: I0217 14:39:54.776350 4762 scope.go:117] "RemoveContainer" 
containerID="50c8de832d208cc3dce00abede55cbc12c20e0b90b960c7d2476f0be0f5efd46" Feb 17 14:39:55 crc kubenswrapper[4762]: I0217 14:39:55.792421 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerStarted","Data":"c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"} Feb 17 14:40:04 crc kubenswrapper[4762]: I0217 14:40:04.790495 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-fblcw_e37a158f-5b24-474c-9405-fc86bef30818/kube-rbac-proxy/0.log" Feb 17 14:40:04 crc kubenswrapper[4762]: I0217 14:40:04.940470 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-fblcw_e37a158f-5b24-474c-9405-fc86bef30818/controller/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.068143 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-frr-files/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.260265 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-reloader/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.281577 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-frr-files/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.313182 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-metrics/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.316730 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-reloader/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.750542 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-metrics/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.766469 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-reloader/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.780775 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-metrics/0.log" Feb 17 14:40:05 crc kubenswrapper[4762]: I0217 14:40:05.792030 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-frr-files/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.055796 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/controller/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.065952 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-frr-files/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.073950 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-metrics/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.079837 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/cp-reloader/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.767723 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/frr-metrics/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.768526 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/kube-rbac-proxy/0.log" Feb 17 14:40:06 crc kubenswrapper[4762]: I0217 14:40:06.768971 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/kube-rbac-proxy-frr/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.022924 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-bd9n7_eb14da33-81db-4b59-8325-af90620744fe/frr-k8s-webhook-server/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.053615 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/reloader/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.279932 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55bbdb8f74-wdnm5_ecb19ca9-7000-48bf-b390-37343271ee18/manager/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.545394 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cf86c5464-wt796_3838870d-4c8c-4055-a512-454c8d7bf205/webhook-server/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.653518 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w6fdr_89cf356f-3fde-40db-9749-8f0bd5f61407/kube-rbac-proxy/0.log" Feb 17 14:40:07 crc kubenswrapper[4762]: I0217 14:40:07.759340 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kmqrr_8ff3f905-182a-4670-9789-efea7744fa7a/frr/0.log" Feb 17 14:40:08 crc kubenswrapper[4762]: I0217 14:40:08.098208 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w6fdr_89cf356f-3fde-40db-9749-8f0bd5f61407/speaker/0.log" Feb 17 14:40:17 crc kubenswrapper[4762]: I0217 14:40:17.054964 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wtc2k"] Feb 17 14:40:17 crc kubenswrapper[4762]: I0217 14:40:17.070289 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wtc2k"] Feb 17 14:40:18 crc kubenswrapper[4762]: I0217 14:40:18.084937 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc27563b-a5bb-4e82-a286-e0628e7c07b3" path="/var/lib/kubelet/pods/cc27563b-a5bb-4e82-a286-e0628e7c07b3/volumes" Feb 17 14:40:22 crc kubenswrapper[4762]: I0217 14:40:22.751471 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/util/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.004187 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.007940 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.020260 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/util/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.280912 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/util/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.282398 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.290972 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e199cf6q_4ec02e2b-3e6e-4ec8-8690-37aefcf86ab5/extract/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.468002 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/util/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.639881 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.648843 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/util/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.690027 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.892123 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/pull/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.902475 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/extract/0.log" Feb 17 14:40:23 crc kubenswrapper[4762]: I0217 14:40:23.906837 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08q49rz_2c0144bd-21f9-4515-909e-dfc320b5e239/util/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.051307 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/util/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.231111 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/util/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.266774 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/pull/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.301265 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/pull/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.437436 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/util/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.459815 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/pull/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.469509 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thxt6_f00bbd70-901c-4a63-a6b4-ca6a97f6df6f/extract/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.613013 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-utilities/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.768776 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-utilities/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.787525 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-content/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.788654 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-content/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.956268 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-content/0.log" Feb 17 14:40:24 crc kubenswrapper[4762]: I0217 14:40:24.963223 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/extract-utilities/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.254919 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-utilities/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.368888 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hrk6m_f2458360-5ec8-41fa-a098-9cf66b726192/registry-server/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.381473 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-utilities/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.427209 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-content/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.476309 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-content/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.648149 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-utilities/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.677170 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/extract-content/0.log" Feb 17 14:40:25 crc kubenswrapper[4762]: I0217 14:40:25.881359 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.151191 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-blnm9_c3e8a03a-97a3-4727-84ef-9683f533aa17/registry-server/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.160927 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.196753 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.196781 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.366527 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.368319 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/extract/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.415035 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e0898994fld_0b88810f-7e51-448f-91a4-327a41a07307/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.525578 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.718871 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.727723 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.763828 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.877593 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/util/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.911621 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/pull/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.916889 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakm26m_ce29a95a-c876-4e03-8b7c-89994be40488/extract/0.log" Feb 17 14:40:26 crc kubenswrapper[4762]: I0217 14:40:26.983498 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kpxwm_01244fb5-02d9-4328-ba6a-018283f64d07/marketplace-operator/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.076410 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-utilities/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.133223 4762 scope.go:117] "RemoveContainer" containerID="03bedf90d9de4202da4df646416d5c25cf7f7c0b4f1a31a1cfc7b603b022827f" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.160495 4762 scope.go:117] "RemoveContainer" containerID="47c45593fb8aba9e37a2a183212858aca006aa1eb329e1e177dd0ccb9fe0095a" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.250983 4762 scope.go:117] "RemoveContainer" containerID="887e15ad19fc27a12c37952a2b9950f8a8812e9e7a0510cec185fc9d3fd62b66" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.276407 4762 scope.go:117] "RemoveContainer" containerID="6891113cf2d6697324e6a167a135f0c060a38fb3d450da77bda9de60f207c8f2" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.300914 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-content/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.312061 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-utilities/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.322260 4762 scope.go:117] "RemoveContainer" containerID="34cc702e78165783238ac76fa93e6b1533c509faaf06d4e865695cada48f2d68" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.341323 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-content/0.log" 
Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.361088 4762 scope.go:117] "RemoveContainer" containerID="cd1e6e1172c720beeffc6bfbd56af158da86b64d766a642b82e86e719c4d0803" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.567570 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-utilities/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.650057 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/registry-server/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.676691 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8przg_197d8c37-eac6-4f4a-9f95-fa1da2ff23e7/extract-content/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.684444 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-utilities/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.808431 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-utilities/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.857679 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-content/0.log" Feb 17 14:40:27 crc kubenswrapper[4762]: I0217 14:40:27.882795 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-content/0.log" Feb 17 14:40:28 crc kubenswrapper[4762]: I0217 14:40:28.096815 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-utilities/0.log" Feb 17 14:40:28 crc kubenswrapper[4762]: I0217 14:40:28.102628 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/extract-content/0.log" Feb 17 14:40:28 crc kubenswrapper[4762]: I0217 14:40:28.439248 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g66qj_440d9e9b-109c-4794-93b8-e18e3232ad49/registry-server/0.log" Feb 17 14:40:36 crc kubenswrapper[4762]: I0217 14:40:36.051178 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5mknf"] Feb 17 14:40:36 crc kubenswrapper[4762]: I0217 14:40:36.065972 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lq7n6"] Feb 17 14:40:36 crc kubenswrapper[4762]: I0217 14:40:36.083240 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5mknf"] Feb 17 14:40:36 crc kubenswrapper[4762]: I0217 14:40:36.090830 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lq7n6"] Feb 17 14:40:38 crc kubenswrapper[4762]: I0217 14:40:38.091818 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53984f9c-be03-44a6-91da-65972a4b4cd5" path="/var/lib/kubelet/pods/53984f9c-be03-44a6-91da-65972a4b4cd5/volumes" Feb 17 14:40:38 crc kubenswrapper[4762]: I0217 14:40:38.094932 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64" path="/var/lib/kubelet/pods/8c5a32ed-9d71-4bb0-b72f-f7ac5b55fa64/volumes" Feb 17 14:40:40 crc kubenswrapper[4762]: I0217 14:40:40.788299 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-csbmw_d135e9df-e707-48e4-a0ad-0d400cb5b0c8/prometheus-operator/0.log" Feb 17 14:40:40 crc kubenswrapper[4762]: I0217 14:40:40.808271 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86644c88f-l5r9r_d126b4fc-9d8e-4886-8f76-53268a51258b/prometheus-operator-admission-webhook/0.log" Feb 17 14:40:40 crc kubenswrapper[4762]: I0217 14:40:40.877093 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86644c88f-xgzjx_77607659-a202-47d9-8358-aa339e9ce99d/prometheus-operator-admission-webhook/0.log" Feb 17 14:40:40 crc kubenswrapper[4762]: I0217 14:40:40.994225 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fb6t4_5d34e0ae-c3d1-4d05-8a59-ca531de00d98/operator/0.log" Feb 17 14:40:41 crc kubenswrapper[4762]: I0217 14:40:41.038500 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-656mp_0e153059-08c6-4155-af14-f724a156b6fd/observability-ui-dashboards/0.log" Feb 17 14:40:41 crc kubenswrapper[4762]: I0217 14:40:41.101437 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-788lp_9decd9a9-2c51-42dc-8fed-78efbe4c828e/perses-operator/0.log" Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.055759 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-95lkq"] Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.071867 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-h7qp8"] Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.081868 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-95lkq"] Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.134431 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-smktq"] Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.147986 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-smktq"] Feb 17 14:40:53 crc kubenswrapper[4762]: I0217 14:40:53.162262 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-h7qp8"] Feb 17 14:40:54 crc kubenswrapper[4762]: I0217 14:40:54.090485 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3" path="/var/lib/kubelet/pods/8a5b150f-b06b-45f2-be43-0b0ed9e6b7e3/volumes" Feb 17 14:40:54 crc kubenswrapper[4762]: I0217 14:40:54.091698 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c276b7-cca9-42c7-8605-5f2bfa0da0e1" path="/var/lib/kubelet/pods/a9c276b7-cca9-42c7-8605-5f2bfa0da0e1/volumes" Feb 17 14:40:54 crc kubenswrapper[4762]: I0217 14:40:54.092848 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ea0210-709e-4a47-87d1-48c811c0ab85" path="/var/lib/kubelet/pods/d6ea0210-709e-4a47-87d1-48c811c0ab85/volumes" Feb 17 14:40:54 crc kubenswrapper[4762]: I0217 14:40:54.610582 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59cfb98864-gc6tj_425e262b-13e9-474a-85f5-1a0501569aa9/kube-rbac-proxy/0.log" Feb 17 14:40:54 crc kubenswrapper[4762]: I0217 14:40:54.657122 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59cfb98864-gc6tj_425e262b-13e9-474a-85f5-1a0501569aa9/manager/0.log" Feb 17 14:41:04 crc kubenswrapper[4762]: E0217 14:41:04.746836 4762 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.214:43710->38.102.83.214:37405: write tcp 38.102.83.214:43710->38.102.83.214:37405: write: broken pipe Feb 17 14:41:27 crc kubenswrapper[4762]: I0217 14:41:27.529232 4762 scope.go:117] "RemoveContainer" containerID="c6759c99c71e5d3d5fe8cf99a1ee57341afec410927c40befc9081b3cbae7a1e" Feb 17 14:41:27 crc kubenswrapper[4762]: I0217 14:41:27.592330 4762 scope.go:117] "RemoveContainer" containerID="3fb17ebbd8e146f643a15b507ad009691f75a0af1f916266e833930bfdc95b3a" Feb 17 14:41:27 crc kubenswrapper[4762]: I0217 14:41:27.797852 4762 scope.go:117] "RemoveContainer" containerID="e6e299e92349cffa5cd65ef41d287abc4aa99b44f8b6799fabb9fa73461b3607" Feb 17 14:41:27 crc kubenswrapper[4762]: I0217 14:41:27.866224 4762 scope.go:117] "RemoveContainer" containerID="f865c92eac1476eafc2c0c30e7afe7ee2571d6f3d907e473e0ff9d179a5c8edf" Feb 17 14:41:27 crc kubenswrapper[4762]: I0217 14:41:27.962346 4762 scope.go:117] "RemoveContainer" containerID="17aab810c353d27f1546f39fc1e9219e77f96483a29332f4c8a4803d99560833" Feb 17 14:41:51 crc kubenswrapper[4762]: I0217 14:41:51.053538 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jljhd"] Feb 17 14:41:51 crc kubenswrapper[4762]: I0217 14:41:51.078678 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0142-account-create-update-9mv69"] Feb 17 14:41:51 crc kubenswrapper[4762]: I0217 14:41:51.090066 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0142-account-create-update-9mv69"] Feb 17 14:41:51 crc kubenswrapper[4762]: I0217 14:41:51.101314 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jljhd"] Feb 17 14:41:52 crc kubenswrapper[4762]: I0217 14:41:52.042257 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nnss4"] Feb 17 14:41:52 crc kubenswrapper[4762]: I0217 14:41:52.053671 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nnss4"] Feb 17 14:41:52 crc kubenswrapper[4762]: I0217 14:41:52.089701 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277ee237-c640-42ab-8439-d23e72f087e1" path="/var/lib/kubelet/pods/277ee237-c640-42ab-8439-d23e72f087e1/volumes" Feb 17 14:41:52 crc kubenswrapper[4762]: I0217 14:41:52.091532 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8711f3-a902-4c23-8c91-3e8819cc74ca" path="/var/lib/kubelet/pods/bb8711f3-a902-4c23-8c91-3e8819cc74ca/volumes" Feb 17 14:41:52 crc kubenswrapper[4762]: I0217 14:41:52.093057 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da99eccd-0482-4e64-bb27-6b87437ae8ba" path="/var/lib/kubelet/pods/da99eccd-0482-4e64-bb27-6b87437ae8ba/volumes" Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.047903 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kz5nv"] Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.390175 4762 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9c9e-account-create-update-2865f"] Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.408061 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8886-account-create-update-w9f55"] Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.419396 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8886-account-create-update-w9f55"] Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.430148 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kz5nv"] Feb 17 14:41:53 crc kubenswrapper[4762]: I0217 14:41:53.443349 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9c9e-account-create-update-2865f"] Feb 17 14:41:54 crc kubenswrapper[4762]: I0217 14:41:54.094200 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8" path="/var/lib/kubelet/pods/8d240d3d-d93f-4185-a6fd-5a4ba25eb5a8/volumes" Feb 17 14:41:54 crc kubenswrapper[4762]: I0217 14:41:54.096170 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bb5440-4045-43cc-acbd-a61bc6b8efa7" path="/var/lib/kubelet/pods/b6bb5440-4045-43cc-acbd-a61bc6b8efa7/volumes" Feb 17 14:41:54 crc kubenswrapper[4762]: I0217 14:41:54.099596 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5fb9f5e-d096-4b3d-82cb-881bcc844cab" path="/var/lib/kubelet/pods/d5fb9f5e-d096-4b3d-82cb-881bcc844cab/volumes" Feb 17 14:41:54 crc kubenswrapper[4762]: I0217 14:41:54.621311 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:41:54 crc kubenswrapper[4762]: I0217 14:41:54.622699 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.442970 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"] Feb 17 14:42:18 crc kubenswrapper[4762]: E0217 14:42:18.448201 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" containerName="container-00" Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.448343 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" containerName="container-00" Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.449026 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae96d33-ebf9-4885-aaf1-dac1acf5eb18" containerName="container-00" Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.455030 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.477087 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"]
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.812179 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2thwg\" (UniqueName: \"kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.812660 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.813297 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.916981 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2thwg\" (UniqueName: \"kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.917122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.917450 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.917804 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.918098 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:18 crc kubenswrapper[4762]: I0217 14:42:18.953579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2thwg\" (UniqueName: \"kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg\") pod \"certified-operators-fxmn2\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") " pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:19 crc kubenswrapper[4762]: I0217 14:42:19.087359 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:19 crc kubenswrapper[4762]: I0217 14:42:19.918698 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"]
Feb 17 14:42:20 crc kubenswrapper[4762]: I0217 14:42:20.497695 4762 generic.go:334] "Generic (PLEG): container finished" podID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerID="8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8" exitCode=0
Feb 17 14:42:20 crc kubenswrapper[4762]: I0217 14:42:20.497796 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerDied","Data":"8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8"}
Feb 17 14:42:20 crc kubenswrapper[4762]: I0217 14:42:20.498050 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerStarted","Data":"36e8abfddf0ba0dd39ad92ffa310a4cb311d07c196896cd399e4c7f7204100ee"}
Feb 17 14:42:20 crc kubenswrapper[4762]: I0217 14:42:20.500751 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:42:21 crc kubenswrapper[4762]: I0217 14:42:21.511237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerStarted","Data":"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"}
Feb 17 14:42:23 crc kubenswrapper[4762]: I0217 14:42:23.689298 4762 generic.go:334] "Generic (PLEG): container finished" podID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerID="75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e" exitCode=0
Feb 17 14:42:23 crc kubenswrapper[4762]: I0217 14:42:23.689831 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerDied","Data":"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"}
Feb 17 14:42:24 crc kubenswrapper[4762]: I0217 14:42:24.621599 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:42:24 crc kubenswrapper[4762]: I0217 14:42:24.622555 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:42:24 crc kubenswrapper[4762]: I0217 14:42:24.706746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerStarted","Data":"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"}
Feb 17 14:42:24 crc kubenswrapper[4762]: I0217 14:42:24.733257 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxmn2" podStartSLOduration=3.115520706 podStartE2EDuration="6.733207163s" podCreationTimestamp="2026-02-17 14:42:18 +0000 UTC" firstStartedPulling="2026-02-17 14:42:20.500328005 +0000 UTC m=+2221.080328657" lastFinishedPulling="2026-02-17 14:42:24.118014462 +0000 UTC m=+2224.698015114" observedRunningTime="2026-02-17 14:42:24.723262054 +0000 UTC m=+2225.303262736" watchObservedRunningTime="2026-02-17 14:42:24.733207163 +0000 UTC m=+2225.313207815"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.102213 4762 scope.go:117] "RemoveContainer" containerID="73297b536a093f8cfe7bdaf06c10d9fb0994bd62ea41652f37bfcbab4296d283"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.153146 4762 scope.go:117] "RemoveContainer" containerID="0114be74a9a7fafa9144c6bb345a89d6a976631f4269bd151a35887ce990a5c0"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.193790 4762 scope.go:117] "RemoveContainer" containerID="fcaecfe9e3ce19cb2373ae5e2053e815efa636d9678d4dffc4d12d0db7ebc9dd"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.243040 4762 scope.go:117] "RemoveContainer" containerID="8281960df4711a0ed57712cf1c3d31c153c2d3903dbfc30b5ee22eae721aeb48"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.294035 4762 scope.go:117] "RemoveContainer" containerID="c11054ab4bee3fbdac5eb4396c9b77028cc1f98238cda254ac44fa4f621f54e6"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.350498 4762 scope.go:117] "RemoveContainer" containerID="496cc57796dd27fdb322dce4f895bd33a74f61948764b2bbf10850f997eeef14"
Feb 17 14:42:28 crc kubenswrapper[4762]: I0217 14:42:28.410810 4762 scope.go:117] "RemoveContainer" containerID="33f97202480ecfda56e480dc6249c5de214583f94de8cfdbe0667c9701d847ce"
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.057604 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7x82n"]
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.074151 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7x82n"]
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.103525 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.103583 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.161943 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.811942 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:29 crc kubenswrapper[4762]: I0217 14:42:29.871195 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"]
Feb 17 14:42:30 crc kubenswrapper[4762]: I0217 14:42:30.087604 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bb66fd-cea7-435b-8915-0641110c25af" path="/var/lib/kubelet/pods/92bb66fd-cea7-435b-8915-0641110c25af/volumes"
Feb 17 14:42:31 crc kubenswrapper[4762]: I0217 14:42:31.786569 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fxmn2" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="registry-server" containerID="cri-o://2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af" gracePeriod=2
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.351457 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.502997 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content\") pod \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") "
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.503161 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2thwg\" (UniqueName: \"kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg\") pod \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") "
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.503243 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities\") pod \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\" (UID: \"64dbab0b-c7c0-4749-88c4-d80ebe954a47\") "
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.505138 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities" (OuterVolumeSpecName: "utilities") pod "64dbab0b-c7c0-4749-88c4-d80ebe954a47" (UID: "64dbab0b-c7c0-4749-88c4-d80ebe954a47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.513598 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg" (OuterVolumeSpecName: "kube-api-access-2thwg") pod "64dbab0b-c7c0-4749-88c4-d80ebe954a47" (UID: "64dbab0b-c7c0-4749-88c4-d80ebe954a47"). InnerVolumeSpecName "kube-api-access-2thwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.565954 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64dbab0b-c7c0-4749-88c4-d80ebe954a47" (UID: "64dbab0b-c7c0-4749-88c4-d80ebe954a47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.606879 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.606924 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2thwg\" (UniqueName: \"kubernetes.io/projected/64dbab0b-c7c0-4749-88c4-d80ebe954a47-kube-api-access-2thwg\") on node \"crc\" DevicePath \"\""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.606942 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64dbab0b-c7c0-4749-88c4-d80ebe954a47-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.798522 4762 generic.go:334] "Generic (PLEG): container finished" podID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerID="2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af" exitCode=0
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.798587 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxmn2"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.798598 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerDied","Data":"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"}
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.798949 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxmn2" event={"ID":"64dbab0b-c7c0-4749-88c4-d80ebe954a47","Type":"ContainerDied","Data":"36e8abfddf0ba0dd39ad92ffa310a4cb311d07c196896cd399e4c7f7204100ee"}
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.798970 4762 scope.go:117] "RemoveContainer" containerID="2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.825498 4762 scope.go:117] "RemoveContainer" containerID="75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.849565 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"]
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.858316 4762 scope.go:117] "RemoveContainer" containerID="8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.863190 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fxmn2"]
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.930184 4762 scope.go:117] "RemoveContainer" containerID="2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"
Feb 17 14:42:32 crc kubenswrapper[4762]: E0217 14:42:32.930707 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af\": container with ID starting with 2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af not found: ID does not exist" containerID="2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.930752 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af"} err="failed to get container status \"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af\": rpc error: code = NotFound desc = could not find container \"2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af\": container with ID starting with 2cc48fd30bca6122f5a993f270ef1889ff53dde05ea8989ec07030ace8eeb6af not found: ID does not exist"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.930774 4762 scope.go:117] "RemoveContainer" containerID="75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"
Feb 17 14:42:32 crc kubenswrapper[4762]: E0217 14:42:32.931150 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e\": container with ID starting with 75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e not found: ID does not exist" containerID="75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.931179 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e"} err="failed to get container status \"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e\": rpc error: code = NotFound desc = could not find container \"75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e\": container with ID starting with 75799469120c8ef98646fcbb4b10d138cf9170aedd25228f080684fef87e003e not found: ID does not exist"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.931198 4762 scope.go:117] "RemoveContainer" containerID="8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8"
Feb 17 14:42:32 crc kubenswrapper[4762]: E0217 14:42:32.931611 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8\": container with ID starting with 8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8 not found: ID does not exist" containerID="8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8"
Feb 17 14:42:32 crc kubenswrapper[4762]: I0217 14:42:32.931781 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8"} err="failed to get container status \"8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8\": rpc error: code = NotFound desc = could not find container \"8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8\": container with ID starting with 8d5c1eed63dc14641d22ea2cb574d72bda912bed13fbe1bf5553c4cd4b3519f8 not found: ID does not exist"
Feb 17 14:42:34 crc kubenswrapper[4762]: I0217 14:42:34.086483 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" path="/var/lib/kubelet/pods/64dbab0b-c7c0-4749-88c4-d80ebe954a47/volumes"
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.034911 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-phqhg"]
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.049393 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0adb-account-create-update-v2qxg"]
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.059156 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0adb-account-create-update-v2qxg"]
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.068495 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-phqhg"]
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.089866 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55" path="/var/lib/kubelet/pods/3f1e3a4c-93df-4e5d-8b28-9ff7d0966c55/volumes"
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.092966 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5722df-f962-403c-abfa-793bc821be57" path="/var/lib/kubelet/pods/5b5722df-f962-403c-abfa-793bc821be57/volumes"
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.273317 4762 generic.go:334] "Generic (PLEG): container finished" podID="8bfff96d-6c90-4a80-9024-7539e414a009" containerID="80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6" exitCode=0
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.273401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" event={"ID":"8bfff96d-6c90-4a80-9024-7539e414a009","Type":"ContainerDied","Data":"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"}
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.274796 4762 scope.go:117] "RemoveContainer" containerID="80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"
Feb 17 14:42:52 crc kubenswrapper[4762]: I0217 14:42:52.449323 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xsj4g_must-gather-xb8ps_8bfff96d-6c90-4a80-9024-7539e414a009/gather/0.log"
Feb 17 14:42:54 crc kubenswrapper[4762]: I0217 14:42:54.621542 4762 patch_prober.go:28] interesting pod/machine-config-daemon-rwhnp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:42:54 crc kubenswrapper[4762]: I0217 14:42:54.622007 4762 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:42:54 crc kubenswrapper[4762]: I0217 14:42:54.622072 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp"
Feb 17 14:42:54 crc kubenswrapper[4762]: I0217 14:42:54.623322 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"} pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:42:54 crc kubenswrapper[4762]: I0217 14:42:54.623389 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerName="machine-config-daemon" containerID="cri-o://c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" gracePeriod=600
Feb 17 14:42:54 crc kubenswrapper[4762]: E0217 14:42:54.745383 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:42:55 crc kubenswrapper[4762]: I0217 14:42:55.310960 4762 generic.go:334] "Generic (PLEG): container finished" podID="3eb11ce5-3ff7-4743-a879-95285dae2998" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" exitCode=0
Feb 17 14:42:55 crc kubenswrapper[4762]: I0217 14:42:55.311047 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" event={"ID":"3eb11ce5-3ff7-4743-a879-95285dae2998","Type":"ContainerDied","Data":"c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"}
Feb 17 14:42:55 crc kubenswrapper[4762]: I0217 14:42:55.311329 4762 scope.go:117] "RemoveContainer" containerID="7866eecacac248138bc6cd774a1ac22e147432f4d4ced0c1eaa06720947d6b4f"
Feb 17 14:42:55 crc kubenswrapper[4762]: I0217 14:42:55.312145 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:42:55 crc kubenswrapper[4762]: E0217 14:42:55.312487 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:00 crc kubenswrapper[4762]: I0217 14:43:00.731376 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xsj4g/must-gather-xb8ps"]
Feb 17 14:43:00 crc kubenswrapper[4762]: I0217 14:43:00.732249 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xsj4g/must-gather-xb8ps" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="copy" containerID="cri-o://e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23" gracePeriod=2
Feb 17 14:43:00 crc kubenswrapper[4762]: I0217 14:43:00.742130 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xsj4g/must-gather-xb8ps"]
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.265571 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xsj4g_must-gather-xb8ps_8bfff96d-6c90-4a80-9024-7539e414a009/copy/0.log"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.266557 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/must-gather-xb8ps"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.394316 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xsj4g_must-gather-xb8ps_8bfff96d-6c90-4a80-9024-7539e414a009/copy/0.log"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.394742 4762 generic.go:334] "Generic (PLEG): container finished" podID="8bfff96d-6c90-4a80-9024-7539e414a009" containerID="e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23" exitCode=143
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.394804 4762 scope.go:117] "RemoveContainer" containerID="e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.394991 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xsj4g/must-gather-xb8ps"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.400753 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7gnl\" (UniqueName: \"kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl\") pod \"8bfff96d-6c90-4a80-9024-7539e414a009\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") "
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.400818 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output\") pod \"8bfff96d-6c90-4a80-9024-7539e414a009\" (UID: \"8bfff96d-6c90-4a80-9024-7539e414a009\") "
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.436271 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl" (OuterVolumeSpecName: "kube-api-access-v7gnl") pod "8bfff96d-6c90-4a80-9024-7539e414a009" (UID: "8bfff96d-6c90-4a80-9024-7539e414a009"). InnerVolumeSpecName "kube-api-access-v7gnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.447131 4762 scope.go:117] "RemoveContainer" containerID="80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.503932 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7gnl\" (UniqueName: \"kubernetes.io/projected/8bfff96d-6c90-4a80-9024-7539e414a009-kube-api-access-v7gnl\") on node \"crc\" DevicePath \"\""
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.575940 4762 scope.go:117] "RemoveContainer" containerID="e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23"
Feb 17 14:43:01 crc kubenswrapper[4762]: E0217 14:43:01.576556 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23\": container with ID starting with e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23 not found: ID does not exist" containerID="e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.576599 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23"} err="failed to get container status \"e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23\": rpc error: code = NotFound desc = could not find container \"e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23\": container with ID starting with e42d72e77cb76cd2f2b3dfb4a4353f60e1e459bee4ce5c1d01764b12f93fab23 not found: ID does not exist"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.576626 4762 scope.go:117] "RemoveContainer" containerID="80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"
Feb 17 14:43:01 crc kubenswrapper[4762]: E0217 14:43:01.577254 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6\": container with ID starting with 80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6 not found: ID does not exist" containerID="80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.577281 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6"} err="failed to get container status \"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6\": rpc error: code = NotFound desc = could not find container \"80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6\": container with ID starting with 80c60f35b00598555dbdf1787dc0f33f9781ddac9d8801ae585dfe4dc9f502d6 not found: ID does not exist"
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.621229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8bfff96d-6c90-4a80-9024-7539e414a009" (UID: "8bfff96d-6c90-4a80-9024-7539e414a009"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:43:01 crc kubenswrapper[4762]: I0217 14:43:01.710679 4762 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8bfff96d-6c90-4a80-9024-7539e414a009-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 14:43:02 crc kubenswrapper[4762]: I0217 14:43:02.101521 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" path="/var/lib/kubelet/pods/8bfff96d-6c90-4a80-9024-7539e414a009/volumes"
Feb 17 14:43:04 crc kubenswrapper[4762]: I0217 14:43:04.057167 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wdbb8"]
Feb 17 14:43:04 crc kubenswrapper[4762]: I0217 14:43:04.091593 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wdbb8"]
Feb 17 14:43:06 crc kubenswrapper[4762]: I0217 14:43:06.043746 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9zsnn"]
Feb 17 14:43:06 crc kubenswrapper[4762]: I0217 14:43:06.056466 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9zsnn"]
Feb 17 14:43:06 crc kubenswrapper[4762]: I0217 14:43:06.071545 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:43:06 crc kubenswrapper[4762]: E0217 14:43:06.071983 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:06 crc kubenswrapper[4762]: I0217 14:43:06.082462 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae10efe-5821-4182-8f8b-bd9c6cc13a4d" path="/var/lib/kubelet/pods/5ae10efe-5821-4182-8f8b-bd9c6cc13a4d/volumes"
Feb 17 14:43:06 crc kubenswrapper[4762]: I0217 14:43:06.085669 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4589d86-754e-46ec-bd8f-412abdf21890" path="/var/lib/kubelet/pods/a4589d86-754e-46ec-bd8f-412abdf21890/volumes"
Feb 17 14:43:11 crc kubenswrapper[4762]: I0217 14:43:11.039839 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-fgpcm"]
Feb 17 14:43:11 crc kubenswrapper[4762]: I0217 14:43:11.054290 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-fgpcm"]
Feb 17 14:43:12 crc kubenswrapper[4762]: I0217 14:43:12.097072 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82cbcf38-171c-4676-988f-a742b4277bb6" path="/var/lib/kubelet/pods/82cbcf38-171c-4676-988f-a742b4277bb6/volumes"
Feb 17 14:43:19 crc kubenswrapper[4762]: I0217 14:43:19.072201 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:43:19 crc kubenswrapper[4762]: E0217 14:43:19.074407 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.586557 4762 scope.go:117] "RemoveContainer" containerID="d8df3855e0f6149ffd61f131162f7a26a55a32bd0885c8d0d06c0ea10669f091"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.642420 4762 scope.go:117] "RemoveContainer" containerID="0a7db91915ffc089979e848f81e2557ee1f9543eceec4a23d5f5ea6017f3e657"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.721450 4762 scope.go:117] "RemoveContainer" containerID="73161d86078c8db13cbff44883dd9f44405ed482a55af875f557eee2037e6468"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.753920 4762 scope.go:117] "RemoveContainer" containerID="561cbb4ba0f490708913ac6ccd73f550bfd7b006b2b4821a8959f193b20c40bb"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.853067 4762 scope.go:117] "RemoveContainer" containerID="745b57e6bf2efa1b71aa23513113a2fb00baba1fc7cb99b978eda5e9db9a2354"
Feb 17 14:43:28 crc kubenswrapper[4762]: I0217 14:43:28.957856 4762 scope.go:117] "RemoveContainer" containerID="6b56d7029a2965e5de4afa01619427cb94928a5a2b9f8f1aa928695001e8cc1d"
Feb 17 14:43:33 crc kubenswrapper[4762]: I0217 14:43:33.071031 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:43:33 crc kubenswrapper[4762]: E0217 14:43:33.071942 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:44 crc kubenswrapper[4762]: I0217 14:43:44.071750 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:43:44 crc kubenswrapper[4762]: E0217 14:43:44.072791 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:46 crc kubenswrapper[4762]: I0217 14:43:46.088322 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hmbsl"]
Feb 17 14:43:46 crc kubenswrapper[4762]: I0217 14:43:46.088807 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hmbsl"]
Feb 17 14:43:48 crc kubenswrapper[4762]: I0217 14:43:48.086150 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15862fc-7a11-484e-8343-c565ddcc60eb" path="/var/lib/kubelet/pods/c15862fc-7a11-484e-8343-c565ddcc60eb/volumes"
Feb 17 14:43:55 crc kubenswrapper[4762]: I0217 14:43:55.072516 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:43:55 crc kubenswrapper[4762]: E0217 14:43:55.073612 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.963917 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:43:59 crc kubenswrapper[4762]: E0217 14:43:59.965059 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="extract-utilities"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965083 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="extract-utilities"
Feb 17 14:43:59 crc kubenswrapper[4762]: E0217 14:43:59.965117 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="registry-server"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965126 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="registry-server"
Feb 17 14:43:59 crc kubenswrapper[4762]: E0217 14:43:59.965149 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="gather"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965157 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="gather"
Feb 17 14:43:59 crc kubenswrapper[4762]: E0217 14:43:59.965177 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="extract-content"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965184 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="extract-content"
Feb 17 14:43:59 crc kubenswrapper[4762]: E0217 14:43:59.965212 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="copy"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965221 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="copy"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965519 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="copy"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965538 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="64dbab0b-c7c0-4749-88c4-d80ebe954a47" containerName="registry-server"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.965569 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfff96d-6c90-4a80-9024-7539e414a009" containerName="gather"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.967747 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:43:59 crc kubenswrapper[4762]: I0217 14:43:59.985089 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.134479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.134636 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.134747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2cl\" (UniqueName: \"kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.236759 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2cl\" (UniqueName: \"kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.236945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.237093 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.237885 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.237920 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.257330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2cl\" (UniqueName: \"kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl\") pod \"community-operators-kzck5\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") " pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.292911 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:00 crc kubenswrapper[4762]: I0217 14:44:00.901440 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:44:01 crc kubenswrapper[4762]: I0217 14:44:01.242749 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerStarted","Data":"0898ca2d26e5c5244a5858af236e402bd2f1e7e73241994a8a049a8033e17821"}
Feb 17 14:44:02 crc kubenswrapper[4762]: I0217 14:44:02.259679 4762 generic.go:334] "Generic (PLEG): container finished" podID="da02678b-6749-4871-af58-b8f3d3205752" containerID="0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974" exitCode=0
Feb 17 14:44:02 crc kubenswrapper[4762]: I0217 14:44:02.259901 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerDied","Data":"0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974"}
Feb 17 14:44:03 crc kubenswrapper[4762]: I0217 14:44:03.293191 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerStarted","Data":"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"}
Feb 17 14:44:05 crc kubenswrapper[4762]: I0217 14:44:05.323499 4762 generic.go:334] "Generic (PLEG): container finished" podID="da02678b-6749-4871-af58-b8f3d3205752" containerID="811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751" exitCode=0
Feb 17 14:44:05 crc kubenswrapper[4762]: I0217 14:44:05.323554 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerDied","Data":"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"}
Feb 17 14:44:06 crc kubenswrapper[4762]: I0217 14:44:06.344021 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerStarted","Data":"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"}
Feb 17 14:44:06 crc kubenswrapper[4762]: I0217 14:44:06.373944 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzck5" podStartSLOduration=3.699852984 podStartE2EDuration="7.373921406s" podCreationTimestamp="2026-02-17 14:43:59 +0000 UTC" firstStartedPulling="2026-02-17 14:44:02.263784518 +0000 UTC m=+2322.843785170" lastFinishedPulling="2026-02-17 14:44:05.93785291 +0000 UTC m=+2326.517853592" observedRunningTime="2026-02-17 14:44:06.37112899 +0000 UTC m=+2326.951129642" watchObservedRunningTime="2026-02-17 14:44:06.373921406 +0000 UTC m=+2326.953922058"
Feb 17 14:44:07 crc kubenswrapper[4762]: I0217 14:44:07.072232 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:44:07 crc kubenswrapper[4762]: E0217 14:44:07.072623 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:44:10 crc kubenswrapper[4762]: I0217 14:44:10.293960 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:10 crc kubenswrapper[4762]: I0217 14:44:10.295903 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:10 crc kubenswrapper[4762]: I0217 14:44:10.365773 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:11 crc kubenswrapper[4762]: I0217 14:44:11.450316 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:11 crc kubenswrapper[4762]: I0217 14:44:11.514820 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:44:13 crc kubenswrapper[4762]: I0217 14:44:13.444194 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzck5" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="registry-server" containerID="cri-o://9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5" gracePeriod=2
Feb 17 14:44:13 crc kubenswrapper[4762]: I0217 14:44:13.983973 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.284346 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities\") pod \"da02678b-6749-4871-af58-b8f3d3205752\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") "
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.284664 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2cl\" (UniqueName: \"kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl\") pod \"da02678b-6749-4871-af58-b8f3d3205752\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") "
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.284743 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content\") pod \"da02678b-6749-4871-af58-b8f3d3205752\" (UID: \"da02678b-6749-4871-af58-b8f3d3205752\") "
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.290282 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities" (OuterVolumeSpecName: "utilities") pod "da02678b-6749-4871-af58-b8f3d3205752" (UID: "da02678b-6749-4871-af58-b8f3d3205752"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.294986 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl" (OuterVolumeSpecName: "kube-api-access-ph2cl") pod "da02678b-6749-4871-af58-b8f3d3205752" (UID: "da02678b-6749-4871-af58-b8f3d3205752"). InnerVolumeSpecName "kube-api-access-ph2cl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.339806 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da02678b-6749-4871-af58-b8f3d3205752" (UID: "da02678b-6749-4871-af58-b8f3d3205752"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.388438 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.388500 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2cl\" (UniqueName: \"kubernetes.io/projected/da02678b-6749-4871-af58-b8f3d3205752-kube-api-access-ph2cl\") on node \"crc\" DevicePath \"\""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.388513 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da02678b-6749-4871-af58-b8f3d3205752-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.461812 4762 generic.go:334] "Generic (PLEG): container finished" podID="da02678b-6749-4871-af58-b8f3d3205752" containerID="9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5" exitCode=0
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.461888 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerDied","Data":"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"}
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.461980 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzck5" event={"ID":"da02678b-6749-4871-af58-b8f3d3205752","Type":"ContainerDied","Data":"0898ca2d26e5c5244a5858af236e402bd2f1e7e73241994a8a049a8033e17821"}
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.462011 4762 scope.go:117] "RemoveContainer" containerID="9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.463266 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzck5"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.504007 4762 scope.go:117] "RemoveContainer" containerID="811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.517713 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.526843 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzck5"]
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.541987 4762 scope.go:117] "RemoveContainer" containerID="0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.580029 4762 scope.go:117] "RemoveContainer" containerID="9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"
Feb 17 14:44:14 crc kubenswrapper[4762]: E0217 14:44:14.580632 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5\": container with ID starting with 9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5 not found: ID does not exist" containerID="9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.580701 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5"} err="failed to get container status \"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5\": rpc error: code = NotFound desc = could not find container \"9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5\": container with ID starting with 9acd0a492ec73666d28440d033178839b6edbd207c28d4cbdfc8a91811c22ac5 not found: ID does not exist"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.580733 4762 scope.go:117] "RemoveContainer" containerID="811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"
Feb 17 14:44:14 crc kubenswrapper[4762]: E0217 14:44:14.583769 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751\": container with ID starting with 811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751 not found: ID does not exist" containerID="811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.583813 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751"} err="failed to get container status \"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751\": rpc error: code = NotFound desc = could not find container \"811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751\": container with ID starting with 811568f1ea1214c47dd11e077bc4cfbf3d3e726f2d276b5e9deec27afe655751 not found: ID does not exist"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.583843 4762 scope.go:117] "RemoveContainer" containerID="0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974"
Feb 17 14:44:14 crc kubenswrapper[4762]: E0217 14:44:14.584303 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974\": container with ID starting with 0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974 not found: ID does not exist" containerID="0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974"
Feb 17 14:44:14 crc kubenswrapper[4762]: I0217 14:44:14.584344 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974"} err="failed to get container status \"0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974\": rpc error: code = NotFound desc = could not find container \"0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974\": container with ID starting with 0065c6083b608b95dda8672016fb1dece0c80f1528ce013d492d7eaf5f4b7974 not found: ID does not exist"
Feb 17 14:44:16 crc kubenswrapper[4762]: I0217 14:44:16.095130 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da02678b-6749-4871-af58-b8f3d3205752" path="/var/lib/kubelet/pods/da02678b-6749-4871-af58-b8f3d3205752/volumes"
Feb 17 14:44:20 crc kubenswrapper[4762]: I0217 14:44:20.086875 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:44:20 crc kubenswrapper[4762]: E0217 14:44:20.088154 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:44:29 crc kubenswrapper[4762]: I0217 14:44:29.136700 4762 scope.go:117] "RemoveContainer" containerID="5eec962dd211446ef8a8f7d17ba4922b5ce36ef85cec693ce7a62710fce9a4f5"
Feb 17 14:44:35 crc kubenswrapper[4762]: I0217 14:44:35.071506 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:44:35 crc kubenswrapper[4762]: E0217 14:44:35.074524 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:44:46 crc kubenswrapper[4762]: I0217 14:44:46.070667 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:44:46 crc kubenswrapper[4762]: E0217 14:44:46.071622 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:44:57 crc kubenswrapper[4762]: I0217 14:44:57.071414 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077"
Feb 17 14:44:57 crc kubenswrapper[4762]: E0217 14:44:57.072566 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.181722 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"]
Feb 17 14:45:00 crc kubenswrapper[4762]: E0217 14:45:00.182761 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="extract-utilities"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.182783 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="extract-utilities"
Feb 17 14:45:00 crc kubenswrapper[4762]: E0217 14:45:00.182805 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="extract-content"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.182812 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="extract-content"
Feb 17 14:45:00 crc kubenswrapper[4762]: E0217 14:45:00.182846 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="registry-server"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.182856 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="registry-server"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.183113 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="da02678b-6749-4871-af58-b8f3d3205752" containerName="registry-server"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.184101 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.186465 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.186897 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.203514 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"]
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.280914 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.281130 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnmkj\" (UniqueName: \"kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.281371 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.383871 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnmkj\" (UniqueName: \"kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.384029 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.384093 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.385126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.393487 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.403269 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnmkj\" (UniqueName: \"kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj\") pod \"collect-profiles-29522325-ww27l\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:00 crc kubenswrapper[4762]: I0217 14:45:00.530991 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"
Feb 17 14:45:01 crc kubenswrapper[4762]: I0217 14:45:01.046261 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l"]
Feb 17 14:45:01 crc kubenswrapper[4762]: I0217 14:45:01.827308 4762 generic.go:334] "Generic (PLEG): container finished" podID="d5612c15-7bc4-4ee0-93cf-955c52187af2" containerID="61c70425da1e5717e3533052ed9c0b348328a323e4a3d469f61c05b9a7800785" exitCode=0
Feb 17 14:45:01 crc kubenswrapper[4762]: I0217 14:45:01.827426 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l" event={"ID":"d5612c15-7bc4-4ee0-93cf-955c52187af2","Type":"ContainerDied","Data":"61c70425da1e5717e3533052ed9c0b348328a323e4a3d469f61c05b9a7800785"}
Feb 17 14:45:01 crc kubenswrapper[4762]: I0217 14:45:01.827754 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l" event={"ID":"d5612c15-7bc4-4ee0-93cf-955c52187af2","Type":"ContainerStarted","Data":"53de421671db89404bea7c87ab863c705302238dc30dd83326069d1c8c7433e6"}
Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.262070 4762 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.380102 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume\") pod \"d5612c15-7bc4-4ee0-93cf-955c52187af2\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.380160 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume\") pod \"d5612c15-7bc4-4ee0-93cf-955c52187af2\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.380564 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnmkj\" (UniqueName: \"kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj\") pod \"d5612c15-7bc4-4ee0-93cf-955c52187af2\" (UID: \"d5612c15-7bc4-4ee0-93cf-955c52187af2\") " Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.381268 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5612c15-7bc4-4ee0-93cf-955c52187af2" (UID: "d5612c15-7bc4-4ee0-93cf-955c52187af2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.387160 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj" (OuterVolumeSpecName: "kube-api-access-vnmkj") pod "d5612c15-7bc4-4ee0-93cf-955c52187af2" (UID: "d5612c15-7bc4-4ee0-93cf-955c52187af2"). InnerVolumeSpecName "kube-api-access-vnmkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.390098 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5612c15-7bc4-4ee0-93cf-955c52187af2" (UID: "d5612c15-7bc4-4ee0-93cf-955c52187af2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.483635 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnmkj\" (UniqueName: \"kubernetes.io/projected/d5612c15-7bc4-4ee0-93cf-955c52187af2-kube-api-access-vnmkj\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.483700 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5612c15-7bc4-4ee0-93cf-955c52187af2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.483715 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5612c15-7bc4-4ee0-93cf-955c52187af2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.854465 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l" event={"ID":"d5612c15-7bc4-4ee0-93cf-955c52187af2","Type":"ContainerDied","Data":"53de421671db89404bea7c87ab863c705302238dc30dd83326069d1c8c7433e6"} Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.854526 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53de421671db89404bea7c87ab863c705302238dc30dd83326069d1c8c7433e6" Feb 17 14:45:03 crc kubenswrapper[4762]: I0217 14:45:03.854565 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-ww27l" Feb 17 14:45:04 crc kubenswrapper[4762]: I0217 14:45:04.381372 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"] Feb 17 14:45:04 crc kubenswrapper[4762]: I0217 14:45:04.395261 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-ppgsj"] Feb 17 14:45:06 crc kubenswrapper[4762]: I0217 14:45:06.094924 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f66bf06-e190-40a2-8503-9e4b5b2f65c6" path="/var/lib/kubelet/pods/3f66bf06-e190-40a2-8503-9e4b5b2f65c6/volumes" Feb 17 14:45:11 crc kubenswrapper[4762]: I0217 14:45:11.072441 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:45:11 crc kubenswrapper[4762]: E0217 14:45:11.073518 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:45:23 crc kubenswrapper[4762]: I0217 14:45:23.071538 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:45:23 crc kubenswrapper[4762]: E0217 14:45:23.072414 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:45:29 crc kubenswrapper[4762]: I0217 14:45:29.226233 4762 scope.go:117] "RemoveContainer" containerID="ca16c54075c1d04387ef3558088928141f7d5941473278a0cb4f2937f37c7ddc" Feb 17 14:45:37 crc kubenswrapper[4762]: I0217 14:45:37.072719 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:45:37 crc kubenswrapper[4762]: E0217 14:45:37.074198 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:45:48 crc kubenswrapper[4762]: I0217 14:45:48.071624 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:45:48 crc kubenswrapper[4762]: E0217 14:45:48.072499 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:46:01 crc kubenswrapper[4762]: I0217 14:46:01.071389 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:46:01 crc kubenswrapper[4762]: E0217 14:46:01.072516 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:46:16 crc kubenswrapper[4762]: I0217 14:46:16.071004 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:46:16 crc kubenswrapper[4762]: E0217 14:46:16.072404 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:46:27 crc kubenswrapper[4762]: I0217 14:46:27.071468 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:46:27 crc kubenswrapper[4762]: E0217 14:46:27.072484 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:46:42 crc kubenswrapper[4762]: I0217 14:46:42.073037 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:46:42 crc kubenswrapper[4762]: E0217 14:46:42.075688 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:46:53 crc kubenswrapper[4762]: I0217 14:46:53.070622 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:46:53 crc kubenswrapper[4762]: E0217 14:46:53.071607 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:47:08 crc kubenswrapper[4762]: I0217 14:47:08.072605 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:47:08 crc kubenswrapper[4762]: E0217 14:47:08.073708 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998" Feb 17 14:47:21 crc kubenswrapper[4762]: I0217 14:47:21.076016 4762 scope.go:117] "RemoveContainer" containerID="c6376dac88834bca2adaeb1edbe9eda17b48d4173f50892f18ee7690c57f9077" Feb 17 14:47:21 crc kubenswrapper[4762]: E0217 14:47:21.077456 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rwhnp_openshift-machine-config-operator(3eb11ce5-3ff7-4743-a879-95285dae2998)\"" pod="openshift-machine-config-operator/machine-config-daemon-rwhnp" podUID="3eb11ce5-3ff7-4743-a879-95285dae2998"